Using any-llm-gateway - JupyterLite AI


any-llm-gateway is an OpenAI-compatible proxy server that provides production-grade cost controls, budget management, and usage tracking across multiple LLM providers (OpenAI, Anthropic, Google, etc.).

Setting up any-llm-gateway

  1. Installation: Install and configure the gateway following the Quick Start guide. This involves creating a directory, downloading the docker-compose.yml, and setting up your config.yml.

  2. Generate a master key:

python -c "import secrets; print(secrets.token_urlsafe(32))"

  3. Configuration: Configure your providers in config.yml with your API keys and model pricing.

  4. Start the gateway: Navigate to your gateway directory and run:

docker compose up -d

  5. Verify the gateway is running:

curl http://localhost:8000/health
# Expected: {"status": "healthy"}
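As an illustration only, a provider entry in config.yml might look like the sketch below. The field names here are assumptions, not the gateway's documented schema; consult the Quick Start guide for the authoritative format.

```yaml
# Hypothetical sketch -- the actual config.yml schema is defined
# in the any-llm-gateway Quick Start guide.
providers:
  openai:
    api_key: "sk-..."        # your OpenAI API key
  anthropic:
    api_key: "sk-ant-..."    # your Anthropic API key
```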

Configuring jupyterlite-ai to use any-llm-gateway

Configure the Generic provider (OpenAI-compatible) with the following settings:

  • Base URL: http://localhost:8000/v1 (or your gateway server URL)

  • Model: The model name with provider prefix (e.g., openai:gpt-4, anthropic:claude-sonnet-4-5-20250929)

  • API Key: Your gateway virtual API key
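Because the gateway is OpenAI-compatible, any OpenAI-style client can talk to it once pointed at the base URL above. The sketch below builds such a request with only the Python standard library; the base URL, virtual API key, and model name are placeholders you would replace with your own values.

```python
import json

# Placeholder values for illustration; substitute your gateway URL,
# your virtual API key, and a model configured in your config.yml.
BASE_URL = "http://localhost:8000/v1"
API_KEY = "your-virtual-api-key"   # gateway virtual key, not a provider key
MODEL = "openai:gpt-4"             # provider-prefixed model name

# The gateway speaks the OpenAI chat-completions wire format, so a
# request is an ordinary POST to /chat/completions with a Bearer token.
url = f"{BASE_URL}/chat/completions"
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Hello from JupyterLite!"}],
}

print(url)
print(json.dumps(payload, indent=2))

# To actually send the request (requires the gateway to be running):
# import urllib.request
# req = urllib.request.Request(
#     url, data=json.dumps(payload).encode(), headers=headers)
# print(urllib.request.urlopen(req).read().decode())
```

This is the same shape of request that jupyterlite-ai's Generic (OpenAI-compatible) provider issues on your behalf once the three settings above are filled in.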