Thunderbolt
AI You Control: Choose your models. Own your data. Eliminate vendor lock-in.
Important
⚠️ We're excited about the interest Thunderbolt has been getting, and want to be clear that it is still early and under active development. We are currently targeting enterprise customers who want to deploy it on-prem. We encourage you to self-host it and try it out, but there are a few caveats we are still working through:
While we eventually plan to make Thunderbolt fully offline-first, it currently depends on authentication and search functionality (though search can be disabled on the integrations screen in the app). To test it locally, deploy your own backend with Docker and sign up.
You'll need to add your own model providers; we don't yet host a public inference endpoint. For free local inference, we recommend pairing Thunderbolt with Ollama or llama.cpp, or you can add API keys for any OpenAI-compatible model provider in the settings.
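As a rough sketch of what "OpenAI-compatible" means here: providers like Ollama expose the same `/v1/chat/completions` request shape as OpenAI, so a client only needs a base URL and (optionally) an API key to switch providers. The snippet below builds such a request with the Python standard library; the base URL is Ollama's default local address, and the model name is an assumption (any model you've pulled works).

```python
import json
from urllib import request

# Assumptions: Ollama is running locally with its OpenAI-compatible API at the
# default address, and a model such as "llama3" has been pulled beforehand.
OLLAMA_BASE_URL = "http://localhost:11434/v1"  # Ollama's default OpenAI-compatible endpoint
MODEL = "llama3"                               # hypothetical; substitute any pulled model

def build_chat_request(prompt: str) -> request.Request:
    """Build an OpenAI-compatible /chat/completions request for a local provider."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        url=f"{OLLAMA_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello from a local model!")
# Sending it requires a running server, e.g.:
#   with request.urlopen(req) as resp:
#       reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the request shape is shared, pointing the same code at a hosted OpenAI-compatible provider is just a matter of changing the base URL and adding an `Authorization: Bearer <key>` header.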
Thunderbolt is an open-source, cross-platform AI client that can be deployed on-prem anywhere.
🌐 Available on the web and all major desktop and mobile platforms: iOS, Android, macOS, Linux, and Windows.
🧠 Compatible with frontier, local, and on-prem models.
🙋 Enterprise features, support, and forward-deployed engineers (FDEs) available.
Thunderbolt is under active development: it is currently undergoing a security audit and preparing for enterprise production readiness.
Need Help?
Found a bug? Have an idea?
We're actively working on our docs, community, and roadmap. For now, the best way to get in touch is to File an issue.
If you discover a security vulnerability, please report it responsibly via our vulnerability reporting form. Please do not file public GitHub issues for security vulnerabilities.