langwatch / langwatch
The open LLM Ops platform - Traces, Analytics, Evaluations, Datasets and Prompt Optimization ✨
LangWatch is an open platform for Observing, Evaluating and Optimizing your LLM and Agentic applications. We offer:
LangWatch is framework- and LLM-agnostic, with support for frameworks (LangGraph, DSPy, Langflow, Flowise, and others) and LLM providers (OpenAI, Azure, Bedrock, Gemini, Deepseek, Groq, MistralAI, VertexAI, LiteLLM, and others) via OpenTelemetry.
Our use of open, community-driven standards is all about supporting your business decisions and keeping your teams flexible in a rapidly changing AI ecosystem, without worrying about compatibility.
Get up and running in under 30 seconds. The example below uses Docker, but if you prefer Helm charts, check out the instructions here.
```bash
git clone https://github.com/langwatch/langwatch.git
cp langwatch/.env.example langwatch/.env
docker compose up -d --wait --build
open http://localhost:5560
```
You'll be launched right into our onboarding flow. Welcome aboard 🫡.
The easiest way to get started with LangWatch is via our Cloud offering. Create a free account to get started.
Read more about it on our docs.
Start just the databases using Docker and leave them running:
```bash
docker compose up redis postgres opensearch
```
Then, on another terminal, install the dependencies and start LangWatch:
```bash
make install
make start
```
Get observing in minutes. Now that you have an account and have created a project inside LangWatch, let's get your messages flowing through it.
Note
Not using Python or OpenAI? Don't worry, we have your back. Visit our docs for full guides covering other popular languages, LLM providers, and frameworks.
The SDK is available for install via pip or uv. By default, it will also check your environment variables for your API key and endpoint.
```bash
pip install langwatch
```

```bash
LANGWATCH_API_KEY="sk-lw-..."
# This is only needed if you aren't using LangWatch Cloud.
# LANGWATCH_ENDPOINT="https://self-hosted-url.internal/"
```
```python
import langwatch
from openai import OpenAI

client = OpenAI()

@langwatch.trace()
def main():
    langwatch.get_current_trace().autotrack_openai_calls(client)
    ...
```
You can also view a public share of the trace here.
LangWatch builds and maintains several integrations listed below. Additionally, our tracing platform is built on top of OpenTelemetry, so we support any OpenTelemetry-compatible library out of the box.
We also support various community standards, such as OpenInference, OpenLLMetry, and more.
Our Python SDK supports the following auto-instrumentors.
Through OpenTelemetry, we also support all the frameworks and providers that support it, such as:
You can find a full guide on our docs.
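For a library that already emits OpenTelemetry spans, pointing it at LangWatch is typically a matter of standard OTLP exporter configuration. A sketch using the standard OTel environment variables; the endpoint URL and header shown are assumptions, so check the docs for the exact values for your setup:

```shell
# Standard OpenTelemetry exporter settings; the collector URL and
# auth header below are illustrative -- see the docs for yours.
OTEL_EXPORTER_OTLP_ENDPOINT="https://app.langwatch.ai/api/otel"
OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer sk-lw-..."
```

With these set, any OTel-instrumented library exports its traces to LangWatch without code changes.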
Our JavaScript SDK supports the following instrumentors:
Are you using a platform that could benefit from a direct LangWatch integration? We'd love to hear from you, please fill out this very quick form.
Have questions or need help? We're here to support you in multiple ways:
Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
Please read our Contribution Guidelines for details on our code of conduct, and the process for submitting pull requests.
Please read our LICENSE.md file.
As a platform that has access to data that is highly likely to be sensitive, we take security incredibly seriously and treat it as a core part of our culture.
| Legal Framework | Current Status |
|---|---|
| GDPR | Compliant. DPA available upon request. |
| ISO 27001 | Certified. Certification report available upon request on our Enterprise plan. |
Please refer to our Security page for more information. Contact us at security@langwatch.ai if you have any further questions.
If you need to do a responsible disclosure of a security vulnerability, you may do so by email to security@langwatch.ai, or if you prefer you can reach out to one of our team privately on Discord.