Show HN: Agent Firewall – Go proxy to kill LLM death spirals
I run 6 AI agents as my entire team. Yesterday two agents got stuck in an infinite loop arguing over JSON formatting. Burned $47 in API calls while I slept.
Anyone running multi-agent setups (CrewAI, AutoGen, LangGraph) knows the pain: agents go rogue, tokens burn, no circuit breaker.
I'm building an open-source Go reverse proxy. Point one env var at it (OPENAI_BASE_URL=http://localhost:8080/v1) and it detects death spirals and cuts the connection before the bill grows.
Local-only, no SDK, no cloud, single binary. I'll ship in 2 weeks if enough people need it.
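To make the idea concrete, here's a minimal sketch of the core trick: a reverse proxy that hashes each request body and trips a breaker once the same payload repeats too often. Everything here is an assumption for illustration (the `maxRepeats` threshold, the `spiralDetected` helper, and the exact detection heuristic are mine, not the project's):

```go
package main

import (
	"bytes"
	"crypto/sha256"
	"io"
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync"
)

// maxRepeats is a hypothetical threshold; the real heuristics may differ.
const maxRepeats = 3

var (
	mu   sync.Mutex
	seen = map[[32]byte]int{}
)

// spiralDetected counts identical request bodies by SHA-256 hash and
// reports a death spiral once a payload has repeated more than maxRepeats times.
func spiralDetected(body []byte) bool {
	h := sha256.Sum256(body)
	mu.Lock()
	defer mu.Unlock()
	seen[h]++
	return seen[h] > maxRepeats
}

func main() {
	upstream, err := url.Parse("https://api.openai.com")
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(upstream)

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		body, _ := io.ReadAll(r.Body)
		r.Body = io.NopCloser(bytes.NewReader(body)) // restore body for upstream
		if spiralDetected(body) {
			// Cut the loop: refuse to forward the repeated request.
			http.Error(w, "loop detected: request blocked", http.StatusTooManyRequests)
			return
		}
		proxy.ServeHTTP(w, r)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Exact-hash matching is the crudest possible detector (real spirals often vary slightly between turns), but it's enough to show why a proxy needs no SDK: the agent framework never knows it's there.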
Have you been burned by runaway agents?