ChunkBack


ChunkBack lets you test your apps with LLM provider endpoints without having to pay for the providers.

ChunkBack is a simple Express server that emulates the request and response formats of popular LLM provider APIs, currently Gemini, Anthropic, and OpenAI. ChunkBack accepts a custom prompt language called CBPL that lets you control the responses it sends back to your application.

Quick Start

Requires Node.js 22 or newer.

With the server running locally, send it a request from your terminal:

curl -X POST http://localhost:5654/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "swag",
    "messages": [
      {"role": "user", "content": "SAY \"First message\"\nCHUNKSIZE 3\nCHUNKLATENCY 50\nSAY \"Second message\""}
    ]
  }'
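
If your app talks to the OpenAI Chat Completions API through the official openai Node SDK, the same prompt can be streamed back chunk by chunk. A minimal sketch, assuming ChunkBack honors the standard stream: true flag and does not validate the API key:

import OpenAI from "openai";

// Point the SDK at a local ChunkBack server instead of api.openai.com.
const client = new OpenAI({
  baseURL: "http://localhost:5654/v1",
  apiKey: "test-key", // assumption: ChunkBack ignores the key value
});

const stream = await client.chat.completions.create({
  model: "swag",
  stream: true,
  messages: [
    {
      role: "user",
      content: 'CHUNKSIZE 3\nCHUNKLATENCY 50\nSAY "Streamed three characters at a time"',
    },
  ],
});

for await (const chunk of stream) {
  // Each chunk should carry CHUNKSIZE characters, roughly CHUNKLATENCY ms apart.
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}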

Demo

hello.mov

Why use ChunkBack?

  • Deterministic API responses - You always get back exactly the content you put in
  • Saves money - When testing your application, you can stub out your LLM calls with ChunkBack
  • Open Source - You can see the code right there!
  • No extra service dependencies - There's no DB, no Redis, nothing: just server code

CBPL - ChunkBack Prompt Language

ChunkBack Prompt Language is a case-sensitive language for controlling LLM responses. There's an autogenerated doc here for usage.

My favorite test script is:

CHUNKSIZE 3
RANDOMLATENCY 50 200
SAY "Each chunk has a random delay between 50-200ms"

How do I integrate this with my app?

OpenAI

Set your OPENAI_API_BASE environment variable to http://localhost:5654 and your app should work out of the box.

Or, just replace https://api.openai.com/v1 with http://localhost:5654/v1 in your API calls.
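
For example, with the official openai Node SDK the override is a single constructor option (a minimal sketch, assuming ChunkBack does not validate the API key):

import OpenAI from "openai";

// Everything else in your application code stays the same.
const client = new OpenAI({
  baseURL: "http://localhost:5654/v1", // instead of https://api.openai.com/v1
  apiKey: "test-key",                  // assumption: any value is accepted
});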

Anthropic

Set your ANTHROPIC_API_BASE environment variable to http://localhost:5654 and your app should work out of the box.

Or, just replace https://api.anthropic.com/v1 with http://localhost:5654/v1 in your API calls.
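
For example, with the official @anthropic-ai/sdk client (a minimal sketch; the model name is arbitrary and ChunkBack is assumed not to validate the key):

import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  baseURL: "http://localhost:5654", // the SDK appends /v1/messages itself
  apiKey: "test-key",               // assumption: any value is accepted
});

const message = await client.messages.create({
  model: "claude-whatever",          // assumption: ChunkBack accepts any model name
  max_tokens: 1024,
  messages: [{ role: "user", content: 'SAY "Hello from ChunkBack"' }],
});

console.log(message.content);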

Gemini

Set your GOOGLE_API_BASE environment variable to http://localhost:5654 and your app should work out of the box.

Or, just replace https://generativelanguage.googleapis.com/v1beta with http://localhost:5654/v1beta in your API calls.
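
For example, with a plain fetch against the Gemini-style REST route (a minimal sketch; the model name in the path is arbitrary and the key query parameter is assumed to be ignored):

// POST to the generateContent route, exactly as you would against
// generativelanguage.googleapis.com, but pointed at ChunkBack.
const res = await fetch(
  "http://localhost:5654/v1beta/models/gemini-pro:generateContent?key=test-key",
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contents: [{ parts: [{ text: 'SAY "Hello from ChunkBack"' }] }],
    }),
  },
);

const data = await res.json();
console.log(data.candidates?.[0]?.content?.parts?.[0]?.text);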

Hosted Version

I am hosting a version of ChunkBack at api.chunkback.com (sign up on the main site, chunkback.com) so you can plug it in without having to set anything up. I'm asking $20 a month for unlimited usage, or you can use it for free for up to 1,000 requests/month.

Contributing

See CONTRIBUTING.md

License

MIT License - see LICENSE for details.

AI Notice

This project was built with the assistance of AI code-generation tools - most of the code was reviewed by a human (painstakingly). This README.md was handwritten, and should be kept that way.