
Trace OpenAI Codex with Langfuse

This guide shows you how to send OpenAI Codex telemetry to Langfuse using Codex’s built-in OpenTelemetry export.

What is OpenAI Codex? Codex is OpenAI’s coding agent that can help you inspect code, edit files, run commands, and complete software engineering tasks from your local environment.

What is Langfuse? Langfuse is an open-source LLM engineering platform that helps teams trace AI applications, debug issues, monitor production behavior, and evaluate quality.

What This Integration Does

With Codex’s OpenTelemetry exporter connected to Langfuse, you can:

  • Capture Codex traces from local coding sessions
  • Inspect model and runtime spans emitted by Codex
  • Monitor latency and errors in Langfuse
  • Tag traces by environment for easier filtering
  • Optionally include prompt content when your policies allow it

The exact spans and attributes depend on the Codex version you are running, but Langfuse can ingest Codex’s OTLP/HTTP traces directly.

How It Works

Codex reads runtime configuration from either:

  • ~/.codex/config.toml for user-level configuration
  • .codex/config.toml for project-level configuration

You configure Codex to export traces via OTLP/HTTP and point it to the Langfuse OpenTelemetry endpoint at /api/public/otel/v1/traces.

Langfuse authenticates OTLP requests via Basic Auth using your Langfuse public and secret keys. Unlike some OTLP backends, you do not need separate workspace or project headers because the Langfuse project is determined by the API keys you use.

⚠️ Codex telemetry export is opt-in. Keep log_user_prompt = false unless your security and privacy policies explicitly allow prompt text to be exported.

Quick Start

Set up Langfuse

Sign up for Langfuse Cloud or self-host Langfuse. Then create a project and copy the public key and secret key from your project settings.

Create the Basic Auth header

Encode your Langfuse public and secret keys as a Basic Auth string:

echo -n "pk-lf-1234567890:sk-lf-1234567890" | base64

Use the output in your Codex configuration as:

Authorization = "Basic <base64-encoded-public-key-colon-secret-key>"

On GNU systems, you may need base64 -w 0 to avoid line wrapping for long keys.
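The steps above can be combined into a single shell sketch. The keys shown are the placeholder values from this guide, not real credentials; substitute your own from the Langfuse project settings:

```shell
# Placeholder keys -- replace with the values from your Langfuse project settings.
PUBLIC_KEY="pk-lf-1234567890"
SECRET_KEY="sk-lf-1234567890"

# Base64-encode "public:secret". printf avoids a trailing newline;
# tr strips any line wrapping that base64 may insert for long input.
AUTH=$(printf '%s' "${PUBLIC_KEY}:${SECRET_KEY}" | base64 | tr -d '\n')

# Emit the header line to paste into your Codex config.toml.
echo "Authorization = \"Basic ${AUTH}\""
```

Using printf instead of echo -n sidesteps shells where echo -n is not portable.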

Update your Codex configuration

Add the following to either ~/.codex/config.toml or .codex/config.toml.

[otel]
trace_exporter = "otlp-http"
environment = "prod"
log_user_prompt = false
 
[otel.trace_exporter.otlp-http]
endpoint = "https://cloud.langfuse.com/api/public/otel/v1/traces"
protocol = "binary"
headers = { "Authorization" = "Basic <base64-encoded-public-key-colon-secret-key>" }

Start a Codex session

Save the configuration, then start Codex and complete a task so it emits traces.

If you are using a project-local config, make sure you launch Codex from that project directory so .codex/config.toml is applied.

View traces in Langfuse

Open your Langfuse project and inspect the traces generated by Codex. Depending on the Codex runtime and exporter output, you can review:

  • Root traces for each Codex run
  • Nested spans for model and runtime activity
  • Environment metadata such as prod
  • Errors and timing information

Troubleshooting

No traces appear in Langfuse

Check the following:

  1. Endpoint path is correct and ends with /api/public/otel/v1/traces
  2. Authorization header is valid and starts with Basic
  3. Your Base64 value encodes exactly public_key:secret_key, with no trailing newline (use echo -n or printf when encoding)
  4. You ran a new Codex session after updating config.toml
  5. You edited the correct config file for the Codex session you started
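A quick way to rule out a malformed header is to decode the Base64 value from your config and confirm it prints the key pair. This sketch uses the encoding of the placeholder keys from this guide:

```shell
# The base64 value from the Authorization header in your config.toml
# (placeholder shown -- paste your own value here).
AUTH_B64="cGstbGYtMTIzNDU2Nzg5MDpzay1sZi0xMjM0NTY3ODkw"

# Decoding must print exactly "<public_key>:<secret_key>" -- no quotes,
# no "Basic " prefix, and no embedded newline.
printf '%s' "$AUTH_B64" | base64 -d
echo
```

If the decoded output contains a line break or extra characters, re-encode the keys and update the header.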

Authentication errors

  • Verify that your Langfuse public key starts with pk-lf-
  • Verify that your Langfuse secret key starts with sk-lf-
  • Confirm that your endpoint matches your Langfuse region:
    • EU Cloud: https://cloud.langfuse.com
    • US Cloud: https://us.cloud.langfuse.com
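If your project is in the US region, only the endpoint line of the configuration changes. A sketch of the relevant fragment:

```toml
[otel.trace_exporter.otlp-http]
endpoint = "https://us.cloud.langfuse.com/api/public/otel/v1/traces"
```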

Self-hosted setup does not work

  • Confirm your Langfuse instance is reachable from the machine running Codex
  • Make sure your Langfuse deployment supports the OpenTelemetry endpoint
  • Verify TLS certificates if you are using HTTPS on a custom domain
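For self-hosted deployments, point the same configuration at your own host. A sketch, where langfuse.internal.example.com is a placeholder for your deployment's base URL:

```toml
[otel.trace_exporter.otlp-http]
# Placeholder host -- replace with your self-hosted Langfuse base URL.
endpoint = "https://langfuse.internal.example.com/api/public/otel/v1/traces"
```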

Learn more in the Langfuse OpenTelemetry guide.

Prompt content is missing

This is expected when log_user_prompt = false.

Only set log_user_prompt = true if exporting prompt text is allowed by your internal policies.
