GitHub - simple10/agent-super-spy: Full stack observability for local AI agent development. Provides a llm-proxy to use as LLM provider base URL, logs traces to opik.


A local, all-in-one LLM observability stack that combines Opik, Phoenix, and mitmproxy around a generic LLM proxy that works with any provider.

Point your agents, SDKs, or tools at the LLM proxy and get full visibility into every API call.

Provider API keys can be optionally configured at the llm-proxy level.

What's Included

Default stack (always running):

  • llm-proxy — Generic LLM API reverse proxy with path-based routing, API key management, and configurable OTEL trace export
  • mitmproxy — Transparent HTTPS proxy with web UI for raw traffic inspection

Optional:

  • phoenix — Phoenix UI and OTEL collector (phoenix profile)
  • Opik — Trace/span visualization and analysis UI
  • claude-sdk-chat — Claude Agent SDK chat UI (claude-chat profile)
  • claude-sdk-proxy — Anthropic API proxy with caching (claude-proxy profile)
  • claude-code — Claude Code CLI container (claude-code profile)

Quick Start

# 1. Interactive setup (generates .env and keys.jsonc)
./setup.sh

# 2. Start everything
./start.sh

# 3. Open the UIs
#    Opik:      http://localhost:5173
#    mitmproxy: http://localhost:8081/?token=mitmpass

How It Works

Your agents / SDKs / tools
        |
        |  ANTHROPIC_BASE_URL=http://localhost:4000/anthropic
        |  OPENAI_BASE_URL=http://localhost:4000/openai
        |  http://localhost:4000/<any-hostname>/path
        |
        v
+- llm-proxy-net -------------------------------------------------------+
|                                                                       |
|  +-------------+    +-----------+    +------------------+             |
|  |  llm-proxy  |--->| mitmproxy |--->| upstream APIs    |             |
|  |  :4000      |    | :8081 UI  |    | anthropic, openai|             |
|  |  key swap + |    |           |    | openrouter, etc. |             |
|  |  opik log   |    +-----------+    +------------------+             |
|  +------+------+                                                      |
|         | traces                                                      |
|         v                                                             |
|  +---------------------------------------------+                      |
|  |  Phoenix / Opik                             |                      |
|  |  - trace/span visualization                 |                      |
|  +---------------------------------------------+                      |
+-----------------------------------------------------------------------+

How To Use It

Agent Super Spy was created to solve one of the biggest pain points of local agent development:

How to see exactly which LLM calls and network traffic your agents generate.

mitmproxy can capture all network traffic for a container, or act as a passthrough proxy for local development. Point your HTTP client at the proxy:

# Route all traffic through mitmproxy for inspection
export HTTP_PROXY=http://localhost:8080
export HTTPS_PROXY=http://localhost:8080

# Or route only LLM calls through the llm-proxy (exports traces via OTEL)
export ANTHROPIC_BASE_URL=http://localhost:4000/anthropic
export OPENAI_BASE_URL=http://localhost:4000/openai

For HTTPS inspection, install the mitmproxy CA certificate (auto-generated on first run) so your client trusts the intercepted connections. See the mitmproxy docs for details.
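Trusting the CA usually comes down to pointing your client's TLS environment variables at the certificate. A minimal sketch, assuming the certificate lands in mitmproxy's usual location (`~/.mitmproxy/mitmproxy-ca-cert.pem`; this stack may write it elsewhere):

```shell
# Assumed cert path; adjust to where this stack's mitmproxy writes it.
CERT="$HOME/.mitmproxy/mitmproxy-ca-cert.pem"

# Common env vars that make TLS clients trust the mitmproxy CA:
export REQUESTS_CA_BUNDLE="$CERT"    # Python requests
export SSL_CERT_FILE="$CERT"         # OpenSSL-based clients
export NODE_EXTRA_CA_CERTS="$CERT"   # Node.js
```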

For LLM logging, the llm-proxy transparently exports traces to the configured OTEL targets and also sends traffic through the mitmproxy. Opik or Phoenix lets you inspect traces. Mitmproxy lets you see the exact headers and request payloads.

The optional claude services (containers) act as reference implementations for how to run agents in Docker containers and automatically capture all outbound traffic.

The general use pattern:

  1. Start the stack
  2. Configure your agents (local or in other containers) to use the llm-proxy as the base URL
  3. View traces in Opik or Phoenix and raw traffic in mitmproxy web UI

LLM Proxy Routing

The proxy uses path-based routing. Set your SDK's base URL to route through the proxy:

  Provider       Base URL                           Example path
  Anthropic      http://localhost:4000/anthropic    /anthropic/v1/messages
  OpenAI         http://localhost:4000/openai       /openai/v1/chat/completions
  Any hostname   http://localhost:4000/<hostname>   /api.openrouter.com/v1/messages

Generic routing: If the first path segment contains a ., it's treated as a hostname and forwarded to https://<hostname>/remaining/path.
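The routing rule can be sketched as a small shell function. The anthropic/openai upstream hosts below are assumptions for illustration only; the real mapping lives in the proxy:

```shell
# Sketch of the path-based routing rule (illustrative, not the proxy's code)
route() {
  local path="${1#/}"          # drop leading slash
  local first="${path%%/*}"    # first path segment
  local rest="${path#*/}"      # everything after it
  case "$first" in
    anthropic) echo "https://api.anthropic.com/$rest" ;;  # assumed upstream
    openai)    echo "https://api.openai.com/$rest" ;;     # assumed upstream
    *.*)       echo "https://$first/$rest" ;;  # generic: segment with "." is a hostname
    *)         echo "no route for: $first" >&2; return 1 ;;
  esac
}

route /anthropic/v1/messages         # -> https://api.anthropic.com/v1/messages
route /api.openrouter.com/v1/models  # -> https://api.openrouter.com/v1/models
```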

API Key Management

The proxy supports transparent API key swapping via keys.jsonc:

How it works:

  1. Your SDK sends requests with x-api-key: my-local-key (or Authorization: Bearer my-local-key)
  2. The proxy looks up my-local-key in keys.jsonc
  3. Finds the real key for the target provider
  4. Swaps it before forwarding to the upstream API

If the key isn't found in keys.jsonc, it's passed through as-is (assumed to be a real key).
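The lookup-and-swap behavior, including the pass-through fallback, can be sketched as follows. The key names and the "real" key value are hypothetical; the actual mapping lives in keys.jsonc:

```shell
# Hypothetical stand-in for the keys.jsonc lookup (illustration only)
swap_key() {
  case "$1" in
    my-local-key) echo "sk-ant-real-provider-key" ;;  # mapped in keys.jsonc
    *)            echo "$1" ;;                        # not found: pass through as-is
  esac
}

swap_key my-local-key     # -> sk-ant-real-provider-key
swap_key sk-already-real  # -> sk-already-real (passed through)
```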

Reload keys without restarting: docker kill --signal=HUP <llm-proxy-container>

Connecting Other Projects

Add the shared network to your project's docker-compose.yml:

networks:
  llm-proxy-net:
    name: llm-proxy-net    # or your configured NETWORK_NAME
    external: true

services:
  my-agent:
    networks:
      - default
      - llm-proxy-net
    environment:
      ANTHROPIC_BASE_URL: http://llm-proxy:4000/anthropic

From the shared network, these host names are available:

  • llm-proxy:4000 — LLM proxy
  • mitmproxy-ui:8081 — mitmproxy web UI
  • phoenix:6006 — Phoenix OTEL collector (if Phoenix enabled)
  • opik-backend:8080 — Opik API (if Opik enabled)
  • opik-frontend:5173 — Opik UI (if Opik enabled)

Configuration

  • COMPOSE_PROJECT_NAME (default: llm-stack) — Docker Compose project name
  • NETWORK_NAME (default: llm-proxy-net) — Shared Docker network name
  • LLM_PROXY_PORT (default: 4000) — LLM proxy host port
  • TRACE_EXPORTERS (default: phoenix) — Comma-delimited trace exporters: opik, phoenix
  • OPIK_OTEL_ENDPOINT (default: http://opik-frontend:5173/api/v1/private/otel/v1/traces) — Opik OTLP traces endpoint
  • OTEL_PROJECT_NAME (default: llm-proxy) — Shared project name for Opik and Phoenix traces
  • PHOENIX_COLLECTOR_ENDPOINT (default: http://phoenix:6006) — Phoenix collector base URL
  • MITMPROXY_UI_PORT (default: 8081) — mitmproxy web UI host port
  • MITMPROXY_WEB_PASSWORD (default: mitmpass) — mitmproxy web UI password
  • COMPOSE_PROFILES — Optional services: phoenix, claude-chat, claude-proxy, claude-code
  • ANTHROPIC_API_KEY — For optional Claude services
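Pulling a few of these together, a .env might look like the following (the values are illustrative, not defaults, and the API key is a placeholder):

```shell
# Example .env fragment: enable Phoenix and the Claude chat UI,
# and export traces to both backends (illustrative values)
COMPOSE_PROFILES=phoenix,claude-chat
TRACE_EXPORTERS=opik,phoenix
OTEL_PROJECT_NAME=my-agent-project
ANTHROPIC_API_KEY=sk-ant-replace-me
```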

Stopping

./stop.sh          # Stop all services, keep network
./stop.sh --clean  # Stop all services and remove network

Additional Resources