Show HN: Chippery, an OpenCode fork that (often) uses 20-40% fewer tokens

chippery.ai

1 point by pell 5 days ago · 0 comments · 1 min read


I kept hitting token limits with Claude Code on larger codebases and ended up building Chippery (a fork of OpenCode) to reduce context size outside the model.

It uses a symbolic index, a navigation layer, semantic and PageRank-like ranking, and some context reduction/compression techniques to avoid resending and re-reading the same files and lookups.
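To illustrate the "PageRank-like ranking" idea, here is a minimal sketch of ranking files by how often other files reference them, so that heavily depended-on files score as more important to keep in context. The graph, damping factor, and iteration count are assumptions for illustration, not Chippery's actual implementation.

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Power-iteration PageRank.

    graph: dict mapping node -> list of nodes it references.
    Returns a dict mapping node -> rank score (scores sum to 1).
    """
    nodes = list(graph)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, targets in graph.items():
            if targets:
                # Distribute this node's rank across the files it references.
                share = damping * rank[node] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # Dangling node: spread its rank uniformly over all nodes.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

# Hypothetical import graph: utils.py is referenced by everything,
# so it ends up ranked highest.
graph = {
    "main.py": ["utils.py", "api.py"],
    "api.py": ["utils.py"],
    "utils.py": [],
}
ranks = pagerank(graph)
most_central = max(ranks, key=ranks.get)
```

In a real tool the edges would come from the symbolic index (imports, call sites, symbol references) rather than a hand-written dict, and the scores would feed into deciding which files stay in the model's context window.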

I ran benchmarks mostly with Anthropic's models and saw roughly 20–40% token reduction on average, depending on the workflow; in some cases quite a bit more, in others less.

There’s also a Claude Code hook that exposes the same tools, but it's still a bit clunky.

It’s fully open-source, with an optional paid Pro / lifetime tier for support.

