jerryjliu / llama_index
LlamaIndex (GPT Index) is a project that provides a central interface to connect your LLMs with external data.
⚠️ NOTE: We are rebranding GPT Index as LlamaIndex! We will carry out this transition gradually.
2/25/2023: By default, our docs/notebooks/instructions now reference "LlamaIndex" instead of "GPT Index".
2/19/2023: By default, our docs/notebooks/instructions now use the `llama-index` package. However, the `gpt-index` package still exists as a duplicate!
2/16/2023: We have a duplicate `llama-index` pip package. Simply replace all imports of `gpt_index` with `llama_index` if you choose to `pip install llama-index`, as in the sketch below.
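A minimal sketch of what that rename looks like in practice (only the top-level module name changes; the classes themselves are the same in both packages):

```python
# Before, with the gpt-index package:
# from gpt_index import GPTSimpleVectorIndex, SimpleDirectoryReader

# After, with the llama-index package -- same classes, new module name:
from llama_index import GPTSimpleVectorIndex, SimpleDirectoryReader
```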
PyPI: https://pypi.org/project/llama-index/ (duplicate package: https://pypi.org/project/gpt-index/).
Documentation: https://gpt-index.readthedocs.io/en/latest/.
Twitter: https://twitter.com/gpt_index.
Discord: https://discord.gg/dGcwcsnxhU.
LlamaHub (community library of data loaders): https://llamahub.ai
NOTE: This README is not updated as frequently as the documentation. Please check out the documentation above for the latest updates!
To perform LLM data augmentation in a performant, efficient, and cheap manner, we need to solve two components: data ingestion and data indexing.
That's where LlamaIndex comes in. LlamaIndex is a simple, flexible interface between your external data and LLMs, providing easy-to-use tools to ingest, index, and query that data.
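As one illustration of the data-connector side, here is a hedged sketch that pulls a community loader from LlamaHub via the `download_loader` helper; the `WikipediaReader` name and its `pages=` argument are assumptions taken from the LlamaHub listings, so check llamahub.ai for each loader's exact usage:

```python
from llama_index import GPTSimpleVectorIndex, download_loader

# Fetch a community data loader from LlamaHub at runtime.
# The loader name and its load_data() arguments are illustrative;
# see llamahub.ai for the usage of each individual loader.
WikipediaReader = download_loader("WikipediaReader")
documents = WikipediaReader().load_data(pages=["Berlin"])

# Index the loaded documents and query them like any other index.
index = GPTSimpleVectorIndex(documents)
print(index.query("What country is Berlin in?"))
```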
Interested in contributing? See our Contribution Guide for more details.
Full documentation can be found here: https://gpt-index.readthedocs.io/en/latest/.
Please check it out for the most up-to-date tutorials, how-to guides, references, and other resources!
To install:

```
pip install llama-index
```
Examples are in the examples folder. Indices are in the indices folder (see the documentation for the list of supported indices).
To build a simple vector store index:

```python
import os
os.environ["OPENAI_API_KEY"] = 'YOUR_OPENAI_API_KEY'

from llama_index import GPTSimpleVectorIndex, SimpleDirectoryReader
documents = SimpleDirectoryReader('data').load_data()
index = GPTSimpleVectorIndex(documents)
```

To save to and load from disk:

```python
# save to disk
index.save_to_disk('index.json')
# load from disk
index = GPTSimpleVectorIndex.load_from_disk('index.json')
```

To query:

```python
index.query("<question_text>?")
```

The main third-party package requirements are tiktoken, openai, and langchain.
All requirements should be contained within the `setup.py` file. To run the package locally without building the wheel, simply run `pip install -r requirements.txt`.
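Returning to the query example above: `index.query` returns a response object rather than a plain string. A minimal sketch of inspecting it (the `response` and `source_nodes` attribute names are assumptions based on the current release and may change):

```python
response = index.query("What did the author do growing up?")

# Printing the object shows the synthesized answer text.
print(response)

# Assumed attributes: the raw answer string and the source chunks
# retrieved to produce it.
print(response.response)
print(response.source_nodes)
```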
Reference to cite if you use LlamaIndex in a paper:
```bibtex
@software{Liu_LlamaIndex_2022,
  author = {Liu, Jerry},
  doi = {10.5281/zenodo.1234},
  month = {11},
  title = {{LlamaIndex}},
  url = {https://github.com/jerryjliu/gpt_index},
  year = {2022}
}
```