OpenTelemetry (OTLP)

The TensorZero Gateway can export traces to an external OpenTelemetry-compatible observability system.

Exporting traces via OpenTelemetry allows you to monitor the TensorZero Gateway in external observability platforms such as Jaeger, Datadog, or Grafana. This integration enables you to correlate gateway activity with the rest of your infrastructure, providing deeper insights and unified monitoring across your systems.

Setup

  1. Enable export.otlp.traces.enabled in the [gateway] section of the tensorzero.toml configuration file:

     [gateway]
     # ...
     export.otlp.traces.enabled = true
     # ...

  2. Set the OTEL_EXPORTER_OTLP_TRACES_ENDPOINT environment variable in the gateway container to the endpoint of your OpenTelemetry service.
Example: TensorZero Gateway and Jaeger with Docker Compose

For example, if you’re deploying the TensorZero Gateway and Jaeger with Docker Compose, you can set the environment variable in your docker-compose.yml:

services:
  gateway:
    image: tensorzero/gateway
    environment:
      OTEL_EXPORTER_OTLP_TRACES_ENDPOINT: http://jaeger:4317
    # ...
  jaeger:
    image: jaegertracing/jaeger
    ports:
      - "4317:4317"
    # ...

Traces

Once configured, the TensorZero Gateway will begin sending traces to your OpenTelemetry-compatible service.

Traces are generated for each HTTP request handled by the gateway (excluding auxiliary endpoints). For inference requests, these traces additionally contain spans that represent the processing of functions, variants, models, and model providers.

Example: Screenshot of a TensorZero Gateway inference request trace in Jaeger
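
To generate a trace to inspect, you can send an inference request to the gateway. The sketch below is a minimal example that assumes the gateway is reachable at http://localhost:3000 (the default port) and that a function named my_function is defined in your tensorzero.toml; adjust both to match your deployment.

import requests

# Send a single inference request to the TensorZero Gateway.
# Assumes the gateway listens on localhost:3000 and that `my_function`
# is a function defined in your tensorzero.toml.
response = requests.post(
    "http://localhost:3000/inference",
    json={
        "function_name": "my_function",
        "input": {
            "messages": [{"role": "user", "content": "Hello, world!"}],
        },
    },
)
response.raise_for_status()
print(response.json())

# The corresponding trace (with spans for the function, variant, model,
# and model provider) should appear in your observability platform shortly afterward.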