cpacker / MemGPT
- Wednesday, October 18, 2023 at 00:00:02
Teaching LLMs memory management for unbounded context
Try out our MemGPT chatbot on Discord!
Join Discord and message the MemGPT bot (in the #memgpt channel). Then run the following commands (messaged to "MemGPT Bot"):

- /profile (to create your profile)
- /key (to enter your OpenAI key)
- /create (to create a MemGPT chatbot)

Make sure your privacy settings on this server are open so that MemGPT Bot can DM you:
MemGPT → Privacy Settings → Direct Messages set to ON

You can see the full list of available commands when you enter / into the message box.
Memory-GPT (MemGPT for short) is a system that intelligently manages different memory tiers in LLMs in order to effectively provide extended context within the LLM's limited context window. For example, MemGPT knows when to push critical information to a vector database and when to retrieve it later in the chat, enabling perpetual conversations. Learn more about MemGPT in our paper.
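The tiered-memory idea can be illustrated with a minimal sketch. This is not MemGPT's actual implementation (which uses an LLM to decide what to archive and a vector database for semantic search); the class name, the fixed-size eviction policy, and the keyword search below are simplifications for illustration only:

```python
# Toy illustration of tiered memory (NOT MemGPT's actual code):
# a small "main context" holds recent messages; when it overflows,
# the oldest messages are evicted to an "archival" store and can be
# searched back in later.

class TieredMemory:
    def __init__(self, main_limit=3):
        self.main_limit = main_limit   # max messages kept in main context
        self.main_context = []         # what would fit in the LLM's window
        self.archival = []             # overflow store (a vector DB in MemGPT)

    def add(self, message):
        self.main_context.append(message)
        while len(self.main_context) > self.main_limit:
            # evict the oldest message to archival storage
            self.archival.append(self.main_context.pop(0))

    def search_archival(self, keyword):
        # stand-in for semantic search over a vector database
        return [m for m in self.archival if keyword.lower() in m.lower()]

mem = TieredMemory(main_limit=2)
for msg in ["My cat is named Mochi.", "I live in Berlin.", "I like hiking."]:
    mem.add(msg)

print(mem.main_context)            # recent messages stay in main context
print(mem.search_archival("cat"))  # the evicted fact is still retrievable
```

The point of the sketch: facts pushed out of the context window are not lost, only moved to a slower tier that can be queried on demand.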
Install dependencies:
pip install -r requirements.txt
Add your OpenAI API key to your environment:
export OPENAI_API_KEY=YOUR_API_KEY
To run MemGPT as a conversation agent in CLI mode, simply run main.py:

python3 main.py
To create a new starter user or starter persona (that MemGPT gets initialized with), create a new .txt file in /memgpt/humans/examples or /memgpt/personas/examples, then use the --persona or --human flag when running main.py. For example:
# assuming you created a new file /memgpt/humans/examples/me.txt
python main.py --human me.txt
main.py flags:

--persona
  load a specific persona file
--human
  load a specific human file
--first
  allows you to send the first message in the chat (by default, MemGPT will send the first message)
--debug
  enables debugging output
--archival_storage_faiss_path=<ARCHIVAL_STORAGE_FAISS_PATH>
  load in a document database (backed by a FAISS index)
--archival_storage_files="<ARCHIVAL_STORAGE_FILES_GLOB>"
  pre-load files into archival memory
--archival_storage_sqldb=<SQLDB_PATH>
  load in a SQL database
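The flags above can be combined in a single invocation. In the sketch below, sam.txt, me.txt, and the glob are hypothetical placeholders; substitute your own persona/human files and paths:

```shell
# start with a specific persona and human, send the first message yourself,
# and show debugging output (sam.txt / me.txt are hypothetical examples)
python main.py --persona sam.txt --human me.txt --first --debug

# pre-load a set of text files into archival memory (hypothetical glob)
python main.py --archival_storage_files="path/to/docs/*.txt"
```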
While using MemGPT via the CLI you can run various commands:

/exit
  exit the CLI
/save
  save a checkpoint of the current agent/conversation state
/load
  load a saved checkpoint
/dump
  view the current message log (see the contents of main context)
/memory
  print the current contents of agent memory
/pop
  undo the last message in the conversation
/heartbeat
  send a heartbeat system message to the agent
/memorywarning
  send a memory warning system message to the agent
MemGPT's archival memory lets you load your database and talk to it! To motivate this use case, we have included a toy example. Consider the test.db already included in the repository:
| id | name    | age |
|----|---------|-----|
| 1  | Alice   | 30  |
| 2  | Bob     | 25  |
| 3  | Charlie | 35  |
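The repository already ships test.db, but if you want to build a similar toy database of your own, Python's standard sqlite3 module is enough. The table name `people` below is a hypothetical choice (the actual schema inside test.db is not shown above); the rows mirror the example table:

```python
# Sketch: building a toy SQLite database like the one shown above.
# "people" is a hypothetical table name; use a file path instead of
# ":memory:" (e.g. "test.db") to write the database to disk.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT, age INTEGER)"
)
cur.executemany(
    "INSERT INTO people (id, name, age) VALUES (?, ?, ?)",
    [(1, "Alice", 30), (2, "Bob", 25), (3, "Charlie", 35)],
)
conn.commit()

# sanity check: the kind of lookup MemGPT would answer in chat
cur.execute("SELECT age FROM people WHERE name = ?", ("Bob",))
print(cur.fetchone()[0])  # prints 25
```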
To talk to this database, run:
python main_db.py --archival_storage_sqldb=memgpt/personas/examples/sqldb/test.db
And then you can input the path to your database, and your query.
Please enter the path to the database. test.db
...
Enter your message: How old is Bob?
...
🤖 Bob is 25 years old.
MemGPT uses gpt-4, so your API key will require gpt-4 API access.

If you have any further questions, or have anything to share, we are excited to hear your feedback!
Datasets used in our paper can be downloaded at HuggingFace.