LangWatch

The open LLM Ops platform - Traces, Analytics, Evaluations, Datasets and Prompt Optimization ✨




LangWatch is an open platform for Observing, Evaluating and Optimizing your LLM and Agentic applications. We offer:

  • Observability - Add tracing to your LLM application, based on the OpenTelemetry standard, giving LangWatch real-time visibility into what your application is doing.
  • Evaluation - Run real-time and offline evaluations against production or synthetic datasets. Compare performance across different configurations (prompts, modules, hosting providers, and complete LLM pipelines).
  • Datasets - Build datasets automatically from traced messages, or upload your own; they can be used across the platform for evaluations.
  • Optimization Studio - Build and run evaluations faster with our no/low-code 🧘 studio. If you need more flexibility, you also have access to the DSL that powers it.
  • Prompt Management & Optimization - Version prompts, test changes on datasets, and improve real-world performance. Auto-optimize with DSPy's MIPROv2 to generate better prompts and few-shot examples (see the illustrative sketch after this list).
  • Annotations - Human in the Loop, done right. Accelerate the creation of better data by combining domain-expert input with smart workflows. Use LangWatch's intuitive annotation interface to collaborate directly with experts while keeping full control over your code. Quickly generate high-quality labels, catch edge cases, and fine-tune datasets to build more accurate, robust AI models.
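
To make the prompt-optimization point above concrete, here is a minimal, illustrative sketch of a MIPROv2 run using DSPy's own API directly (not a LangWatch API; exact signatures vary across DSPy versions, and the model name, metric, and training examples are placeholders):

import dspy
from dspy.teleprompt import MIPROv2

# Point DSPy at an LLM (placeholder model name).
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

# A tiny program to optimize: a single question -> answer predictor.
program = dspy.Predict("question -> answer")

# Toy metric and training data; a real run needs a larger dataset.
def contains_answer(example, prediction, trace=None):
    return example.answer.lower() in prediction.answer.lower()

trainset = [
    dspy.Example(question="What does LLM stand for?", answer="large language model").with_inputs("question"),
    # add more examples here
]

# MIPROv2 searches for better instructions and few-shot demonstrations.
optimizer = MIPROv2(metric=contains_answer, auto="light")
optimized_program = optimizer.compile(program, trainset=trainset)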

LangWatch is framework- and LLM-agnostic, with support for popular frameworks (LangGraph, DSPy, Langflow, Flowise, and others) and LLM providers (OpenAI, Azure, Bedrock, Gemini, Deepseek, Groq, MistralAI, VertexAI, LiteLLM, and others) via OpenTelemetry.

Our use of open, community-driven standards is all about supporting your business decisions and enabling your teams to stay flexible in a rapidly changing AI ecosystem without worrying about compatibility.

🚢 Deploying LangWatch

Local setup 💻

Get up and running in under 30 seconds. The example below uses Docker, but if you prefer Helm charts, check out the instructions here.

git clone https://github.com/langwatch/langwatch.git
cp langwatch/.env.example langwatch/.env
docker compose up -d --wait --build
open http://localhost:5560

You'll be launched right into our onboarding flow. Welcome aboard 🫡.

Cloud ☁️

The easiest way to get started with LangWatch is via our Cloud offering. Create a free account to get started.

Other Flavours 🍦

Self-hosted (OnPrem) ⚓️ LangWatch offers a fully self-hosted version of the platform for companies that require strict data control and compliance, complete with Azure AD support.

Read more about it on our docs.

Hybrid (OnPrem data) 🔀 LangWatch offers a hybrid setup for companies that have strict data control and compliance requirements, without needing to go fully on-prem.

Read more about it on our docs.

Local Development 👩‍💻 You can also run LangWatch locally without Docker to develop and contribute to the project.

Start just the databases using Docker and leave them running:

docker compose up redis postgres opensearch

Then, in another terminal, install the dependencies and start LangWatch:

make install
make start

🚀 Quick Start

Get observing in minutes. Now that you have an account and have created a project inside LangWatch, let's get your messages flowing through it.

Note

Not using Python or OpenAI? Don't worry, we have your back. Visit our docs for full guides for other popular languages, LLM providers, and frameworks.

Install and configure SDK

Available to install via pip or uv. By default, the SDK will also check your environment variables for your API key and endpoint.

pip install langwatch
export LANGWATCH_API_KEY="sk-lw-..."

# This is only needed if you aren't using LangWatch Cloud.
# export LANGWATCH_ENDPOINT="https://self-hosted-url.internal/"
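
If you prefer to configure this from Python instead of the shell, a minimal sketch (using the same placeholder values as above) is to set the variables before importing the SDK, which reads them from the environment:

import os

# Placeholder key from above; only set LANGWATCH_ENDPOINT if you aren't on LangWatch Cloud.
os.environ.setdefault("LANGWATCH_API_KEY", "sk-lw-...")
# os.environ.setdefault("LANGWATCH_ENDPOINT", "https://self-hosted-url.internal/")

import langwatch  # picks the values up from the environment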

Create your first trace and auto-instrument OpenAI

import langwatch
from openai import OpenAI

client = OpenAI()

@langwatch.trace()
def main():
    langwatch.get_current_trace().autotrack_openai_calls(client)
    ...
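
For context, here is a slightly fuller, hedged version of the same snippet; the decorator and autotrack call are exactly as above, while the model name and prompt are purely illustrative:

import langwatch
from openai import OpenAI

client = OpenAI()

@langwatch.trace()  # everything inside this function becomes one trace
def answer(question: str) -> str:
    # Record all OpenAI calls made with this client as spans on the current trace.
    langwatch.get_current_trace().autotrack_openai_calls(client)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

print(answer("What is LangWatch?"))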

See your traces in LangWatch

A view of a trace in the LangWatch app

You can also view a public share of the trace here.

🗺️ Integrations

LangWatch builds and maintains several integrations, listed below. Additionally, our tracing platform is built on top of OpenTelemetry, so we support any OpenTelemetry-compatible library out of the box.

We also support various community standards, such as OpenInference, OpenLLMetry, and more.
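
As a rough sketch of that plain-OpenTelemetry route, any standard OTLP exporter pointed at LangWatch will do. The endpoint path and authorization header below are placeholders rather than documented values; check our docs for the exact configuration:

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Placeholder endpoint and auth header; see the docs for the real values.
exporter = OTLPSpanExporter(
    endpoint="https://your-langwatch-endpoint.example/v1/traces",
    headers={"Authorization": "Bearer sk-lw-..."},
)
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Any OpenTelemetry-compatible instrumentation now flows to LangWatch.
tracer = trace.get_tracer("my-llm-app")
with tracer.start_as_current_span("llm-call"):
    pass  # your existing, instrumented LLM call goes here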

Python 🐍

Our Python SDK ships with auto-instrumentors for a number of popular libraries; see our docs for the current list.

Through OpenTelemetry, we also support all the frameworks and providers that support it, such as:

  • AWS Bedrock
  • Haystack
  • CrewAI
  • Autogen
  • Grok
  • …and many more

You can find a full guide on our docs.
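
As one concrete example of the OpenTelemetry route, here is a hedged sketch using the community OpenInference instrumentor for OpenAI; this is a third-party package rather than the LangWatch SDK, and the package and API names are those published by the OpenInference project:

from openai import OpenAI
from openinference.instrumentation.openai import OpenAIInstrumentor

# Emits OpenTelemetry spans for every OpenAI call; they flow to whatever
# tracer provider and exporter you have configured (see the sketch above).
OpenAIInstrumentor().instrument()

client = OpenAI()
client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Hello!"}],
)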

JavaScript ☕️

Our JavaScript SDK also ships with its own instrumentors; see our docs for the current list.

Platforms

Are you using a platform that could benefit from a direct LangWatch integration? We'd love to hear from you, please fill out this very quick form.

🥰 Community

💬 Support

Have questions or need help? We're here to support you in multiple ways:

  • Documentation: Our comprehensive documentation covers everything from getting started to advanced features.
  • Discord Community: Join our Discord server for real-time help from our team and community.
  • GitHub Issues: Report bugs or request features through our GitHub repository.
  • Enterprise Support: Enterprise customers receive priority support with dedicated response times. Our pricing page contains more information.

🤝 Collaborating

Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

Please read our Contribution Guidelines for details on our code of conduct, and the process for submitting pull requests.

✍️ License

Please read our LICENSE.md file.

👮‍♀️ Security + Compliance

As a platform that has access to data that is highly likely to be sensitive, we take security incredibly seriously and treat it as a core part of our culture.

Our current compliance status:

  • GDPR: Compliant. DPA available upon request.
  • ISO 27001: Certified. Certification report available upon request on our Enterprise plan.

Please refer to our Security page for more information. Contact us at security@langwatch.ai if you have any further questions.

Vulnerability Disclosure

If you need to responsibly disclose a security vulnerability, you can do so by emailing security@langwatch.ai, or, if you prefer, by reaching out to one of our team privately on Discord.