run-llama / sec-insights
- Friday, September 8, 2023 at 00:00:03
A real-world full-stack application using LlamaIndex
SEC Insights uses the Retrieval Augmented Generation (RAG) capabilities of LlamaIndex to answer questions about SEC 10-K & 10-Q documents.
You can start using the application now at secinsights.ai
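For readers who haven't used LlamaIndex before, the core RAG loop the app is built around can be sketched in a few lines. The snippet below is only an illustration, not code from this repository: it assumes `pip install llama-index`, an `OPENAI_API_KEY` in the environment, and a local `data/` folder containing a downloaded 10-K filing, and it uses the top-level `llama_index` imports that were current when this was written.

```python
# Minimal RAG sketch (illustrative only, not the SEC Insights pipeline).
# Assumes: `pip install llama-index`, OPENAI_API_KEY set in the environment,
# and a local data/ directory containing a downloaded 10-K filing (e.g. a PDF).
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Load and chunk the filing, embed the chunks, and build an in-memory vector index.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Ask a question; retrieved chunks are passed to the LLM as context for the answer.
query_engine = index.as_query_engine()
response = query_engine.query("What risk factors does the company highlight in this 10-K?")
print(response)
```

Everything in this sketch runs in memory; a production deployment like this one would typically persist the index in a vector store and add the application layer on top.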
As RAG applications increasingly move from prototype to production, we thought our developer community would find value in a complete example of a working, real-world RAG application.
SEC Insights works as well locally as it does in the cloud. It also comes with many product features that will be immediately applicable to most RAG applications.
Use this repository as a reference when building out your own RAG application or fork it entirely to start your project off with a solid foundation.
See the README.md files in the frontend/ and backend/ folders for individual setup instructions for each.
We've also included a config for a GitHub Codespace in .devcontainer/devcontainer.json. If you choose to use GitHub Codespaces, your codespace will come pre-configured with many of the libraries and system dependencies needed to run this project. This is probably the fastest way to get the project up and running!
We remain very open to contributions! We're looking forward to seeing the ideas and improvements the LlamaIndex community is able to provide.