Archon
Archon is an AI agent that is able to create other AI agents using an advanced agentic coding workflow and framework knowledge base to unlock a new frontier of automated agents.
V6 - Tool Library and MCP Integration: prebuilt tools, examples, and MCP server integration
IMPORTANT UPDATE (March 31st): Archon now includes a library of prebuilt tools, examples, and MCP server integrations. Archon can now incorporate these resources when building new agents, significantly enhancing capabilities and reducing hallucinations.
Archon is the world's first "Agenteer", an AI agent designed to autonomously build, refine, and optimize other AI agents.
It serves both as a practical tool for developers and as an educational framework demonstrating the evolution of agentic systems. Archon will be developed in iterations, starting with just a simple Pydantic AI agent that can build other Pydantic AI agents, all the way to a full agentic workflow using LangGraph that can build other AI agents with any framework. Through its iterative development, Archon showcases the power of planning, feedback loops, and domain-specific knowledge in creating robust AI agents.
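For context, Archon builds agents with Pydantic AI, so the kind of agent it starts from (and produces) looks roughly like the minimal sketch below. This assumes the pydantic-ai package is installed and an LLM API key is configured in your environment; the model name and prompt are illustrative placeholders, not something Archon actually generates.

# Minimal Pydantic AI agent sketch; model name and prompt are illustrative placeholders.
from pydantic_ai import Agent

agent = Agent(
    "openai:gpt-4o",  # any model supported by pydantic-ai
    system_prompt="You are a concise assistant that answers in one sentence.",
)

result = agent.run_sync("What is an AI agent?")
print(result.data)  # older pydantic-ai releases expose .data; newer ones call it .output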
The current version of Archon is V6 as mentioned above - see V6 Documentation for details.
I just created the Archon community forum over in the oTTomator Think Tank! Please post any questions you have there!
See the GitHub Kanban board for feature implementation and bug squashing.
Archon demonstrates key principles of modern AI development: planning, feedback loops, and the integration of domain-specific knowledge.
Since V6 is the current version of Archon, all the code for V6 is in both the main directory and the archon/iterations/v6-tool-library-integration directory.
Note that the examples/tool library for Archon is just starting out. Please feel free to contribute examples, MCP servers, and prebuilt tools!
git clone https://github.com/coleam00/archon.git
cd archon
# This will build both containers and start Archon
python run_docker.py
Note: run_docker.py will automatically:
- Build the MCP server container
- Build the main Archon container
- Run Archon with the appropriate port mappings
- Use environment variables from the .env file if it exists
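For reference, here is a rough Python sketch of what a helper script like run_docker.py does under the hood: build the two images and start the main container with the right port mapping and environment file. The image names, port, and paths below are illustrative assumptions, not Archon's actual values.

# Hedged sketch of building and running the two containers with the docker CLI.
# Image names ("archon", "archon-mcp"), the port mapping, and paths are assumptions.
import os
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Build the MCP server container and the main Archon container.
run(["docker", "build", "-t", "archon-mcp", "./mcp"])
run(["docker", "build", "-t", "archon", "."])

# Start the main container, passing through a .env file if one exists
# and exposing the Streamlit UI on its default port (8501).
docker_run = ["docker", "run", "-d", "--name", "archon", "-p", "8501:8501"]
if os.path.exists(".env"):
    docker_run += ["--env-file", ".env"]
run(docker_run + ["archon"])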
git clone https://github.com/coleam00/archon.git
cd archon
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install -r requirements.txt
streamlit run streamlit_ui.py
After installation, follow the guided setup process in the Intro section of the Streamlit UI. The environment variables you configure there are saved to workbench/env_vars.json. The Streamlit interface will guide you through each step with clear instructions and interactive elements. There are a good number of steps in the setup, but it goes quickly!
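As a rough illustration of how a JSON settings file like workbench/env_vars.json can be consumed by the rest of the app, here is a hedged sketch. The flat key/value layout is an assumption for illustration; Archon's real loading logic lives in its utility code.

# Hedged sketch: load key/value pairs from workbench/env_vars.json into the
# process environment. The flat JSON layout is an assumption.
import json
import os
from pathlib import Path

ENV_FILE = Path("workbench/env_vars.json")

def load_env_vars() -> dict:
    if not ENV_FILE.exists():
        return {}
    env_vars = json.loads(ENV_FILE.read_text())
    for key, value in env_vars.items():
        os.environ.setdefault(key, str(value))  # don't override values already set
    return env_vars

if __name__ == "__main__":
    loaded = load_env_vars()
    print(f"Loaded {len(loaded)} variables from {ENV_FILE}")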
If you encounter any errors when using Archon, please first check the logs in the "Agent Service" tab. Logs specifically for MCP are also written to workbench/logs.txt (the file is created automatically), so please check there as well. The goal is for you to have a clear error message before creating a bug report in the GitHub repo.
To get the latest updates for Archon when using Docker:
# Pull the latest changes from the repository (from within the archon directory)
git pull
# Rebuild and restart the containers with the latest changes
python run_docker.py
The run_docker.py script will automatically rebuild and restart the containers with the latest changes.
To get the latest updates for Archon when using local Python installation:
# Pull the latest changes from the repository (from within the archon directory)
git pull
# Install any new dependencies
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install -r requirements.txt
# Restart the Streamlit UI
# (If you're already running it, stop with Ctrl+C first)
streamlit run streamlit_ui.py
This ensures you're always running the most recent version of Archon with all the latest features and bug fixes.
The agent graph diagram from LangGraph Studio (included in the repository README) is a visual representation of the Archon workflow and the flow it follows.
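To give a feel for how a graph like this is wired, here is a heavily simplified LangGraph sketch of a scope -> code -> refine pipeline. The node names, state fields, and node bodies are illustrative placeholders, not Archon's actual implementation in archon_graph.py.

# Simplified, illustrative LangGraph workflow: define scope -> write code -> refine.
# Node names and State fields are placeholders, not Archon's real graph.
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

class State(TypedDict):
    user_request: str
    scope: str
    agent_code: str

def define_scope(state: State) -> dict:
    return {"scope": f"Scope for: {state['user_request']}"}

def code_agent(state: State) -> dict:
    return {"agent_code": f"# agent implementing: {state['scope']}"}

def refine_agent(state: State) -> dict:
    return {"agent_code": state["agent_code"] + "\n# refined"}

builder = StateGraph(State)
builder.add_node("define_scope", define_scope)
builder.add_node("code_agent", code_agent)
builder.add_node("refine_agent", refine_agent)
builder.add_edge(START, "define_scope")
builder.add_edge("define_scope", "code_agent")
builder.add_edge("code_agent", "refine_agent")
builder.add_edge("refine_agent", END)

graph = builder.compile()
print(graph.invoke({"user_request": "a weather agent", "scope": "", "agent_code": ""}))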
Core files and directories:
- streamlit_ui.py: Comprehensive web interface for managing all aspects of Archon
- graph_service.py: FastAPI service that handles the agentic workflow
- run_docker.py: Script to build and run Archon Docker containers
- Dockerfile: Container definition for the main Archon application
- mcp/: Model Context Protocol server implementation
  - mcp_server.py: MCP server script for AI IDE integration
  - Dockerfile: Container definition for the MCP server
- archon/: Core agent and workflow implementation
  - archon_graph.py: LangGraph workflow definition and agent coordination
  - pydantic_ai_coder.py: Main coding agent with RAG capabilities
  - refiner_agents/: Specialized agents for refining different aspects of the created agent
    - prompt_refiner_agent.py: Optimizes system prompts
    - tools_refiner_agent.py: Specializes in tool implementation
    - agent_refiner_agent.py: Refines agent configuration and dependencies
  - crawl_pydantic_ai_docs.py: Documentation crawler and processor
- utils/: Utility functions and database setup
  - utils.py: Shared utility functions
  - site_pages.sql: Database setup commands
- workbench/: Created at runtime, files specific to your environment
  - env_vars.json: Environment variables defined in the UI are stored here (included in .gitignore, file is created automatically)
  - logs.txt: Low level logs for all Archon processes go here
  - scope.md: The detailed scope document created by the reasoner model at the start of each Archon execution

The Docker implementation consists of two containers:
- Main Archon Container
- MCP Container
When running with Docker, the run_docker.py script automates building and starting both containers with the proper configuration.
The Supabase database uses the following schema:
CREATE TABLE site_pages (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
url TEXT,
chunk_number INTEGER,
title TEXT,
summary TEXT,
content TEXT,
metadata JSONB,
    embedding VECTOR(1536) -- Adjust dimensions as necessary (e.g. 768 for nomic-embed-text)
);
The Streamlit UI provides an interface to set up this database structure automatically.
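Once the table is populated, retrieval is a standard pgvector similarity search. Below is a hedged sketch using psycopg, assuming the pgvector extension is enabled in your Supabase project; the connection string and query embedding are placeholders, and Archon's own retrieval code may go through the Supabase client instead.

# Hedged sketch of a pgvector similarity search over site_pages.
# The DSN and query embedding are placeholders.
import psycopg

DSN = "postgresql://user:password@host:5432/postgres"  # replace with your connection string
query_embedding = [0.0] * 1536  # in practice, the embedding of the user's question

# pgvector expects vectors as a literal like "[0.1,0.2,...]"
vec_literal = "[" + ",".join(str(x) for x in query_embedding) + "]"

with psycopg.connect(DSN) as conn:
    rows = conn.execute(
        """
        SELECT url, title, summary
        FROM site_pages
        ORDER BY embedding <=> %s::vector  -- cosine distance, closest chunks first
        LIMIT 5
        """,
        (vec_literal,),
    ).fetchall()

for url, title, summary in rows:
    print(title, "-", url)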
We welcome contributions! Whether you're fixing bugs, adding features, or improving documentation, please feel free to submit a Pull Request.
For version-specific details, see the documentation for each iteration in the archon/iterations directory.