Requirements: `pip install "mellea[telemetry]"` and Ollama running locally.
Mellea provides built-in OpenTelemetry instrumentation.
Two independent trace scopes can be enabled separately, and a metrics API lets you
collect counters and histograms alongside traces. All telemetry is opt-in — if the
[telemetry] extra is not installed, every telemetry call is a silent no-op.
Note: OpenTelemetry is an optional dependency. Mellea works normally without it. Install with `pip install "mellea[telemetry]"` or `uv pip install "mellea[telemetry]"`.
Configuration
All telemetry is configured via environment variables:

| Variable | Description | Default |
|---|---|---|
| `MELLEA_TRACE_APPLICATION` | Enable application-level tracing | `false` |
| `MELLEA_TRACE_BACKEND` | Enable backend-level tracing | `false` |
| `MELLEA_TRACE_CONSOLE` | Print traces to console (debugging) | `false` |
| `MELLEA_METRICS_ENABLED` | Enable metrics collection | `false` |
| `MELLEA_METRICS_CONSOLE` | Print metrics to console (debugging) | `false` |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | OTLP endpoint for trace and metric export | none |
| `OTEL_SERVICE_NAME` | Service name in exported telemetry | `mellea` |
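For example, a typical setup that enables both trace scopes and exports to a local collector might look like this (the endpoint and service name below are illustrative values, not defaults):

```shell
# Enable both trace scopes (each defaults to false).
export MELLEA_TRACE_APPLICATION=true
export MELLEA_TRACE_BACKEND=true

# Illustrative: point the OTLP exporter at a local collector and tag
# exported telemetry with a recognizable service name.
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
export OTEL_SERVICE_NAME=my-mellea-app

# Then run your application as usual, e.g.:
# python my_app.py
```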
Trace scopes
Mellea has two independent trace scopes:

- `mellea.application`: user-facing operations, including session lifecycle, `@generative` function calls, `instruct()` and `act()` calls, sampling strategies, and requirement validation.
- `mellea.backend`: LLM backend interactions, following the OpenTelemetry Gen-AI Semantic Conventions. Records model calls, token usage, finish reasons, and API latency.
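Because the flags are plain environment variables, they can also be set from Python. A minimal sketch; the assumption that the flags must be set before Mellea is imported (i.e., that they are read at startup) is ours:

```python
import os

# Enable both trace scopes. Set these before importing mellea, on the
# assumption that telemetry is configured at import/startup time.
os.environ["MELLEA_TRACE_APPLICATION"] = "true"  # sessions, instruct(), act()
os.environ["MELLEA_TRACE_BACKEND"] = "true"      # Gen-AI model-call spans

# import mellea  # import only after the flags are set
```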
Using start_session() as a context manager
Wrapping a session in with start_session() ties the trace lifecycle to the session
scope. All spans generated within the block are nested under the session span:
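A minimal sketch, assuming application tracing is enabled and the default backend (Ollama running locally) is available; the prompt is illustrative:

```python
import os

# Assumption: the trace flag must be set before mellea is imported.
os.environ["MELLEA_TRACE_APPLICATION"] = "true"

from mellea import start_session

# The session span opens here; every instruct() call inside the block
# produces spans nested under it, and the span closes on exit.
with start_session() as m:
    summary = m.instruct("Summarize the plot of Hamlet in one sentence.")
    print(summary)
```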
Debugging with console output
Set `MELLEA_TRACE_CONSOLE=true` to print spans directly to stdout without configuring an OTLP backend.

Exporting to an OTLP backend
Any OTLP-compatible backend works. To export to a local Jaeger instance, point `OTEL_EXPORTER_OTLP_ENDPOINT` at Jaeger's OTLP ingestion port (by default `http://localhost:4318` for HTTP, `localhost:4317` for gRPC).

Checking trace status programmatically
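Mellea's own status API is not shown in this excerpt. As a plain-Python approximation, you can inspect the documented environment variables that control each scope; the helper name `trace_status` below is ours, not part of Mellea:

```python
import os

def _flag(name: str) -> bool:
    # All telemetry is opt-in, so anything other than "true" counts as off.
    return os.environ.get(name, "false").strip().lower() == "true"

def trace_status() -> dict:
    """Hypothetical helper: report which telemetry features are enabled,
    by reading the documented environment variables directly."""
    return {
        "application_tracing": _flag("MELLEA_TRACE_APPLICATION"),
        "backend_tracing": _flag("MELLEA_TRACE_BACKEND"),
        "metrics": _flag("MELLEA_METRICS_ENABLED"),
        "otlp_endpoint": os.environ.get("OTEL_EXPORTER_OTLP_ENDPOINT"),
    }

print(trace_status())
```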
Metrics
The metrics API exposes counters, histograms, and up-down counters backed by the OpenTelemetry Metrics API. Enable collection with `MELLEA_METRICS_ENABLED=true`, then use `create_counter` and `create_histogram` to instrument your own code:
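A sketch of custom instrumentation. The import path, and the assumption that the instruments mirror OpenTelemetry's `Counter.add` / `Histogram.record` semantics, are ours; check Mellea's API reference for the exact location of these helpers:

```python
# Assumption: create_counter / create_histogram live in a mellea telemetry
# module; the exact import path may differ in your Mellea version.
from mellea.telemetry import create_counter, create_histogram

# Instrument names, units, and attributes below are illustrative.
requests_total = create_counter(
    "myapp.requests", description="Requests handled by my app"
)
latency_ms = create_histogram("myapp.latency_ms", unit="ms")

requests_total.add(1, {"route": "/summarize"})  # counter: monotonic increments
latency_ms.record(42.0)                         # histogram: record one measurement
```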
If `MELLEA_METRICS_ENABLED` is false or the `[telemetry]` extra is not installed, all instrument calls are no-ops with no overhead.
Note: Metrics are exported to `OTEL_EXPORTER_OTLP_ENDPOINT` when set. If metrics are enabled but no endpoint is configured and `MELLEA_METRICS_CONSOLE` is also `false`, Mellea will log a warning at startup.
Span hierarchy
When both trace scopes are enabled, backend spans nest under the application spans that trigger them: the session span contains `instruct()`/`act()` spans, which in turn contain the backend's model-call spans. Backend spans carry Gen-AI semantic-convention attributes:

| Attribute | Description |
|---|---|
| `gen_ai.system` | LLM provider name (`openai`, `ollama`, `huggingface`) |
| `gen_ai.request.model` | Model requested |
| `gen_ai.response.model` | Model actually used (may differ) |
| `gen_ai.usage.input_tokens` | Input tokens consumed |
| `gen_ai.usage.output_tokens` | Output tokens generated |
| `gen_ai.response.finish_reasons` | Finish reason list (e.g., `["stop"]`) |
Mellea's own spans additionally record framework-specific attributes:

| Attribute | Description |
|---|---|
| `mellea.backend` | Backend class name |
| `mellea.action_type` | Component type being executed |
| `sampling_success` | Whether sampling succeeded |
| `num_generate_logs` | Number of generation attempts |
| `response` | Model response (truncated to 500 chars) |
Full example: `docs/examples/telemetry/telemetry_example.py`