LLM Proxy for Agent Containers

github.com

3 points by kalib_tweli 4 days ago · 1 comment


kalib_tweliOP 4 days ago

An LLM proxy for containerized AI agents: the daemon proxies LLM API calls and remote MCP tool calls through a unix socket, while the runtime drives the agent loop inside the container. Credentials never cross the socket boundary.
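A minimal sketch of the socket-boundary idea described above, not the project's actual protocol: the socket path, request format, and mocked upstream are all illustrative. The runtime side sends a credential-free request over the unix socket; the daemon side injects the API key before it would forward upstream, so the key never enters the container.

```python
import os
import socket
import threading

SOCK_PATH = "/tmp/llm-proxy-demo.sock"  # hypothetical path for this sketch
API_KEY = "sk-demo-secret"              # lives only in the daemon process

def daemon(ready: threading.Event, captured: list):
    # Daemon (host) side: accepts a credential-free request from the
    # container runtime, attaches the API key, and would forward it
    # upstream (mocked here by recording the outgoing request).
    if os.path.exists(SOCK_PATH):
        os.unlink(SOCK_PATH)
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCK_PATH)
    srv.listen(1)
    ready.set()
    conn, _ = srv.accept()
    request = conn.recv(4096).decode()
    # Credential injection happens on this side of the socket boundary:
    upstream_request = f"Authorization: Bearer {API_KEY}\r\n{request}"
    captured.append(upstream_request)
    conn.sendall(b"HTTP/1.1 200 OK\r\n\r\n")
    conn.close()
    srv.close()

def runtime_call(prompt: str) -> str:
    # Runtime side (inside the container): speaks plain HTTP over the
    # socket with no credentials attached.
    cli = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    cli.connect(SOCK_PATH)
    cli.sendall(f"POST /v1/chat HTTP/1.1\r\n\r\n{prompt}".encode())
    reply = cli.recv(4096).decode()
    cli.close()
    return reply

ready = threading.Event()
captured: list = []
t = threading.Thread(target=daemon, args=(ready, captured))
t.start()
ready.wait()
reply = runtime_call("hello")
t.join()
```

The design choice the sketch illustrates: because the container only ever sees the unix socket, revoking or rotating the key is a host-side operation, and a compromised agent cannot exfiltrate credentials it never received.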
