I built a unified Python library for AI batch requests (50% cost savings)

4 points by funerr 8 months ago · 4 comments

funerr (OP) 8 months ago

I needed a Python library to handle complex batch requests to LLMs (Anthropic & OpenAI) and couldn't find a good one, so I built one.

Batch requests take up to 24h but cut costs by ~50%. Features include structured outputs, automatic cost tracking, state resume after interruptions, and citation support (Anthropic only for now).

It's open-source, feedback/contributions welcome!

GitHub: https://github.com/agamm/batchata
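For context on the underlying mechanism (this is not batchata's own API, just the raw provider workflow such a library abstracts): provider batch endpoints accept a JSONL file where each line is one self-contained request, and results come back keyed by a `custom_id` you choose. A minimal sketch of building that input, assuming OpenAI's Batch API line format (`custom_id`, `method`, `url`, `body`) and a hypothetical `write_batch_file` helper:

```python
import json

def build_batch_line(custom_id: str, prompt: str, model: str = "gpt-4o-mini") -> str:
    # One JSONL line per request, following OpenAI's Batch API input format.
    # custom_id is how you match results back to inputs after the (up to 24h) run.
    return json.dumps({
        "custom_id": custom_id,
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    })

def write_batch_file(path: str, prompts: dict) -> None:
    # prompts maps custom_id -> prompt text; one JSONL line per entry.
    with open(path, "w") as f:
        for cid, prompt in prompts.items():
            f.write(build_batch_line(cid, prompt) + "\n")
```

You'd then upload the file and create a batch with a 24h completion window; batch pricing on both providers is roughly half the synchronous rate, which is where the ~50% savings comes from.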

tomgs 8 months ago

Neat! What’s the use case exactly? Kinda hard to figure out from skimming.

  • funerr (OP) 8 months ago

    When you have LLM requests you don't mind waiting on (up to 24h), you can save ~50% in costs. Great for document processing, image classification at scale, or anything where you don't need an immediate result from the LLM provider and cost is a factor.
