mem0ai / mem0
The memory layer for Personalized AI
Mem0 provides a smart, self-improving memory layer for Large Language Models, enabling personalized AI experiences across applications.
Note: The Mem0 repository now also includes the Embedchain project. We continue to maintain and support Embedchain ❤️. You can find the Embedchain codebase in the embedchain directory.
pip install mem0ai
from mem0 import Memory
# Initialize Mem0
m = Memory()
# Store a memory from any unstructured text
result = m.add("I am working on improving my tennis skills. Suggest some online courses.", user_id="alice", metadata={"category": "hobbies"})
print(result)
# Created memory: Improving her tennis skills. Looking for online suggestions.
# Retrieve memories
all_memories = m.get_all()
print(all_memories)
# Search memories
related_memories = m.search(query="What are Alice's hobbies?", user_id="alice")
print(related_memories)
# Update a memory
result = m.update(memory_id="m1", data="Likes to play tennis on weekends")
print(result)
# Get memory history
history = m.history(memory_id="m1")
print(history)
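Memories can also be removed once they are no longer needed. A minimal sketch, assuming the delete helpers exposed by the Memory class (delete, delete_all, and reset) as described in the documentation:
# Delete a single memory by id
m.delete(memory_id="m1")
# Delete all memories for a user
m.delete_all(user_id="alice")
# Reset the store, removing all memories
m.reset()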
For detailed usage instructions and API reference, visit our documentation at docs.mem0.ai.
For production environments, you can use Qdrant as a vector store:
from mem0 import Memory
config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "host": "localhost",
            "port": 6333,
        }
    },
}
m = Memory.from_config(config)
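The configured instance is used exactly like the default one. The sketch below assumes a Qdrant server is already running and reachable at localhost:6333:
# Memories are now persisted in the local Qdrant collection
result = m.add("I prefer morning tennis sessions.", user_id="alice")
print(result)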
Join our Slack or Discord community for support and discussions. If you have any questions, feel free to reach out to us using one of the following methods: