ItsPi3141 / alpaca-electron
An even simpler way to run Alpaca
Alpaca Electron is built from the ground up to be the easiest way to chat with Alpaca AI models. No command line or compiling needed!
Note
Download links will not be provided in this repository.
Download the latest installer from the releases page. You will also need a compatible Alpaca model file (see the note below) placed somewhere easy to find.
Open the installer and wait for it to install.
Once installation finishes, the app will ask for a valid path to a model. Go to where you placed the model, hold Shift, right-click the file, and choose "Copy as Path". Paste that path into the dialog box and click Confirm.
The program will automatically restart. Now you can begin chatting!
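If "Copy as Path" is not available on your platform (it is a Windows Explorer feature), you can print the model's absolute path from a terminal instead and paste that into the same dialog. A minimal sketch, using a placeholder directory and an example model filename:
cd /path/to/your/models              # wherever you saved the model (placeholder path)
echo "$PWD/ggml-alpaca-7b-q4.bin"    # example filename; prints the absolute path to paste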
Note
The program will also accept other 4-bit quantized .bin model files. If you find other .bin Alpaca model files, you can use them instead of the one recommended in the Quick Start Guide to experiment with different models. As always, be careful about what you download from the internet.
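One practical precaution: if whoever distributes a model publishes a checksum, you can verify your download against it before using the file. The filename below is only a placeholder:
sha256sum your-model.bin          # Linux; compare the output to the published checksum
shasum -a 256 your-model.bin      # macOS equivalent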
On macOS, if the system refuses to open the app (for example, reporting that it is damaged or from an unidentified developer), clearing the quarantine attribute usually resolves it:
xattr -cr /Applications/Alpaca\ Electron.app/
On Linux, you can either download the prebuilt binary (packaged as a tar.gz), extract it, and run it with ./alpaca-electron, or build the application yourself.
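A minimal sketch of the prebuilt route, assuming the archive is named alpaca-electron-linux-x64.tar.gz (the actual file and directory names may differ between releases):
tar -xzf alpaca-electron-linux-x64.tar.gz    # extract the prebuilt binary
cd alpaca-electron-linux-x64                 # directory name is an assumption; check what the archive extracts to
./alpaca-electron                            # launch the app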
If you want to build the application yourself:
Clone the repository:
git clone https://github.com/ItsPi3141/alpaca-electron.git
Change your current directory to alpaca-electron:
cd alpaca-electron
Install application-specific dependencies:
npm install --save-dev
Build the application:
npm run linux-x64
Change your current directory to the build target:
cd release-builds/alpaca-electron-linux-x64
Run the application with
./alpaca-electron
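Note that the build steps above assume Node.js and npm are already installed; a quick way to check (this README does not state a minimum version):
node --version    # prints the installed Node.js version
npm --version     # prints the installed npm version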
To run the application with Docker Compose instead:
Clone the repository:
git clone https://github.com/ItsPi3141/alpaca-electron.git
Change your current directory to alpaca-electron:
cd alpaca-electron
Build the container image:
docker compose build
Run the application container:
docker compose up -d
If you want to see the application's output in the terminal, run
docker compose up
(without the -d) instead. If you get an error like "Authorization required, but no authorization protocol specified", run
xhost local:root
on your Docker host.

Credits go to @antimatter15 for creating alpaca.cpp and to @ggerganov for creating llama.cpp, the backbone of alpaca.cpp. Finally, credits go to Meta and Stanford for creating the LLaMA and Alpaca models, respectively.
Special thanks to @keldenl for providing arm64 builds for macOS and to @W48B1T for providing Linux builds.