Tamp: Cut LLM context size ~50% without changing your code
tamp.dev

It's a local proxy (`npx @sliday/tamp`) that sits between your coding agent (Claude Code, Aider, Cursor, Cline, etc.) and the upstream API. It compresses tool-result blocks using several passes: JSON minification, TOON columnar encoding for arrays, line-number prefix stripping, whitespace normalization, and optional LLMLingua-2 neural compression. In our measurements this yields ~52.6% fewer input tokens with zero behavior change.
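To make the idea concrete, here is a minimal sketch of two of the simpler passes (JSON minification and line-number prefix stripping). This is illustrative only, not the actual @sliday/tamp implementation; the function names and the prefix regex are assumptions for the example.

```javascript
// Hypothetical sketch of two Tamp-style compression passes applied to
// tool-result text. Not the real @sliday/tamp code.

// JSON minification: re-serialize pretty-printed JSON without whitespace.
// Non-JSON content is returned unchanged.
function minifyJson(text) {
  try {
    return JSON.stringify(JSON.parse(text));
  } catch {
    return text;
  }
}

// Line-number prefix stripping: drop "12: " / "12→" style prefixes that
// editors and file viewers prepend to snippets (assumed prefix shapes).
function stripLineNumbers(text) {
  return text.replace(/^\s*\d+[:→|]\s?/gm, "");
}

// Usage: chain the passes over a tool-result string before forwarding it.
const result = stripLineNumbers(minifyJson('{\n  "path": "a.js",\n  "hits": [1, 2]\n}'));
console.log(result); // {"path":"a.js","hits":[1,2]}
```

The real proxy applies these passes only to tool-result blocks in the request body, so prompts and model outputs pass through untouched.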
Also added an Openclaw skill here: https://clawhub.ai/sliday/tamp