mudler / LocalAI
- ΠΏΠΎΠ½Π΅Π΄Π΅Π»ΡΠ½ΠΈΠΊ, 22 Π°ΠΏΡΠ΅Π»Ρ 2024β―Π³. Π² 00:00:01
🤖 The free, Open Source OpenAI alternative. Self-hosted, community-driven and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. It allows you to generate text, audio, video and images, and includes voice cloning capabilities.
💡 Get help - ❓FAQ 💭Discussions 💬 Discord 📖 Documentation website
💻 Quickstart 📣 News 🛫 Examples 🖼️ Models 🚀 Roadmap
LocalAI is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that's compatible with OpenAI (Elevenlabs, Anthropic...) API specifications for local AI inferencing. It allows you to run LLMs, generate images, audio (and more) locally or on-prem with consumer-grade hardware, supporting multiple model families. It does not require a GPU.
Hot topics (looking for contributors):
If you want to help and contribute, issues up for grabs: https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3A%22up+for+grabs%22
For a detailed step-by-step introduction, refer to the Getting Started guide.
For those in a hurry, here's a straightforward one-liner to launch a LocalAI AIO (All-in-One) image using Docker:
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
# or, if you have an Nvidia GPU:
# docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-aio-gpu-nvidia-cuda-12
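Once the container is up, the API listens on port 8080 and speaks the OpenAI wire format, so any OpenAI client or plain curl can talk to it. Below is a minimal sketch of a chat completion request; the model name `gpt-4` is assumed to be one of the aliases preconfigured in the AIO images — substitute whatever model you have installed.

```bash
# Ask the local server for a chat completion via the OpenAI-compatible endpoint.
# "gpt-4" is an assumed AIO alias; replace it with a model available on your instance.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "How are you doing?"}],
    "temperature": 0.7
  }'
```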
Text generation with GPTs (llama.cpp, gpt4all.cpp, ... 📖 and more)
Audio to text (audio transcription with whisper.cpp)
Check out the Getting started section in our documentation.
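Because the API mirrors the OpenAI specification, transcription works the same way as with OpenAI's service. The sketch below assumes a whisper model is installed and exposed under the name `whisper-1` (as in the AIO images); adjust the model name and file path to your setup.

```bash
# Transcribe a local audio file via the OpenAI-compatible transcription endpoint.
# "whisper-1" and sample.wav are assumptions; use your own model name and file.
curl http://localhost:8080/v1/audio/transcriptions \
  -H "Content-Type: multipart/form-data" \
  -F file="@$PWD/sample.wav" \
  -F model="whisper-1"
```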
Build and deploy custom containers:
WebUIs:
Model galleries
Other:
If you utilize this repository or its data in a downstream project, please consider citing it with:
@misc{localai,
author = {Ettore Di Giacinto},
title = {LocalAI: The free, Open source OpenAI alternative},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/go-skynet/LocalAI}},
}
Do you find LocalAI useful?
Support the project by becoming a backer or sponsor. Your logo will show up here with a link to your website.
A huge thank you to our generous sponsors who support this project:
| Spectro Cloud |
|---|
| Spectro Cloud kindly supports LocalAI by providing GPU and computing resources to run tests on lambdalabs! |
And a huge shout-out to individuals sponsoring the project by donating hardware or backing the project.
LocalAI is a community-driven project created by Ettore Di Giacinto.
MIT - Author Ettore Di Giacinto
LocalAI couldn't have been built without the help of great software already available from the community. Thank you!
This is a community project, a special thanks to our contributors! 🤗