Getting Started with OpenTelemetry

Agenta accepts traces via the OpenTelemetry Protocol (OTLP) endpoint. You can use any OpenTelemetry-compatible instrumentation library to send traces to Agenta.

OTLP Endpoint

Agenta accepts traces via the OTLP/HTTP protocol using protobuf encoding:

Endpoint: https://cloud.agenta.ai/api/otlp/v1/traces

For self-hosted installations, replace https://cloud.agenta.ai with your instance URL.

Warning: Agenta does not support gRPC for the OpenTelemetry endpoint. Please use HTTP/protobuf instead.
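
In Python, for example, this means importing the HTTP/protobuf exporter rather than the gRPC one. A minimal sketch, assuming the opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http packages are installed:

# Correct: OTLP over HTTP with protobuf encoding
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Will not work with Agenta: the gRPC variant under opentelemetry.exporter.otlp.proto.grpc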

Authentication

Agenta uses ApiKey-based authentication for the OTLP endpoint:

headers: {
  Authorization: `ApiKey ${AGENTA_API_KEY}`
}

Getting Your API Key

  1. Visit the Agenta API Keys page
  2. Click on Create New API Key and follow the prompts
  3. Copy the API key and set it as an environment variable:
export AGENTA_API_KEY="YOUR_AGENTA_API_KEY"
export AGENTA_HOST="https://cloud.agenta.ai" # Change for self-hosted
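
With the key and host exported, a minimal end-to-end setup with the OpenTelemetry Python SDK looks roughly like this (a sketch, assuming the opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http packages):

import os

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Point the OTLP/HTTP exporter at Agenta and authenticate with the ApiKey header.
host = os.environ.get("AGENTA_HOST", "https://cloud.agenta.ai")
exporter = OTLPSpanExporter(
    endpoint=f"{host}/api/otlp/v1/traces",
    headers={"Authorization": f"ApiKey {os.environ['AGENTA_API_KEY']}"},
)

# Register a tracer provider that batches spans and exports them to Agenta.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Emit a test span; it is exported when the batch processor flushes
# (at the latest when the process shuts down).
tracer = trace.get_tracer("quickstart")
with tracer.start_as_current_span("hello-agenta"):
    pass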

Configuration

When using OpenTelemetry SDKs directly (without the Agenta SDK), configure the OTLP exporter to point to Agenta:

OTEL_EXPORTER_OTLP_ENDPOINT="https://cloud.agenta.ai/api/otlp"
OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://cloud.agenta.ai/api/otlp/v1/traces"
OTEL_EXPORTER_OTLP_HEADERS="Authorization=ApiKey ${AGENTA_API_KEY}"

Info: If your collector requires signal-specific environment variables, use the trace-specific endpoint and headers:

OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://cloud.agenta.ai/api/otlp/v1/traces"
OTEL_EXPORTER_OTLP_TRACES_HEADERS="Authorization=ApiKey ${AGENTA_API_KEY}"
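
With these variables exported, most OTLP exporters need no code-level configuration. The Python OTLP/HTTP exporter, for instance, reads the OTEL_EXPORTER_OTLP_* variables when constructed without arguments (a sketch, assuming the same packages as above):

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# No endpoint or headers arguments: the exporter picks them up from the
# OTEL_EXPORTER_OTLP_* environment variables set above.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)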

Supported Languages

OpenTelemetry SDKs are available for many languages:

  • Python: Use the Agenta Python SDK or OpenTelemetry SDK directly
  • Node.js / TypeScript: See the OpenTelemetry Quick Start
  • Java: Use the OpenTelemetry Java SDK
  • Go: Use the OpenTelemetry Go SDK
  • .NET: Use the OpenTelemetry .NET SDK
  • Ruby, PHP, Rust: OpenTelemetry SDKs are available for these as well

All can send traces to Agenta using the OTLP endpoint above.

Using OpenTelemetry Instrumentation Libraries

Agenta is compatible with many OpenTelemetry instrumentation libraries that extend language and framework support. These libraries work seamlessly with Agenta's OTLP endpoint (see the sketch after this list):

  • OpenLLMetry: Supports multiple LLMs (OpenAI, Anthropic, Azure, etc.) and frameworks (LangChain, LlamaIndex)
  • OpenLIT: Comprehensive instrumentation for LLMs, vector DBs, and frameworks
  • OpenInference: Arize's OpenTelemetry instrumentation for LLM applications
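
For example, OpenInference's OpenAI instrumentor can be attached to a tracer provider that already exports to Agenta. A sketch, assuming the openinference-instrumentation-openai package and the SDK setup shown earlier on this page:

from opentelemetry import trace
from openinference.instrumentation.openai import OpenAIInstrumentor

# Attach the instrumentor to the tracer provider that exports to Agenta.
OpenAIInstrumentor().instrument(tracer_provider=trace.get_tracer_provider())

# OpenAI client calls are now captured as spans and sent to Agenta's OTLP endpoint.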

Framework Integrations

Many frameworks have OpenTelemetry support built in or via plugins (see the sketch after this list):

  • LangChain: OpenTelemetry instrumentation available
  • LlamaIndex: OpenTelemetry support via plugins
  • AutoGen: OpenTelemetry compatible
  • Semantic Kernel: OpenTelemetry integration available
  • Spring AI: Java framework with OpenTelemetry support

See the semantic conventions page for details on how Agenta maps OpenTelemetry attributes.

Next Steps