Open-source proxy that lets the Claude Code CLI run on Databricks Model Serving
It emulates the Claude Code backend locally and adds:

- repo indexing + CLAUDE.md summaries
- symbol search + cross-file references
- git automation (diffs, commits, push policies)
- MCP server orchestration (JSON-RPC tools)
- a prompt caching layer
- optional Docker sandboxing
- workspace tools for tests, tasks, diffs, and file ops
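For context on the MCP piece: MCP tools are invoked over JSON-RPC 2.0, typically via a `tools/call` method. Here's a minimal sketch of what such a request looks like (the `search_symbols` tool name and its arguments are made up for illustration, not Lynkr's actual tool names):

```python
import json

def make_tool_call(req_id: int, tool_name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 request in the shape MCP uses for tool calls."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool name/arguments for illustration only.
msg = make_tool_call(1, "search_symbols", {"query": "ProxyHandler"})
parsed = json.loads(msg)
```

The orchestration layer's job is mostly bookkeeping around messages like this: matching responses to `id`s, surfacing tool results back to the model, and handling errors per the JSON-RPC spec.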
Useful if you're doing LLM/AI engineering in environments where you need:

- Databricks as your LLM runtime
- private repo access
- custom agent tools
- auditability and local control
Repo: https://github.com/vishalveerareddy123/Lynkr
Docs: https://vishalveerareddy123.github.io/Lynkr
Would love feedback, issues, or feature ideas from folks experimenting with LLM developer workflows.

Claude Code is surprisingly strict about response shape, tool timing, and streaming semantics. Even small differences in Azure/Databricks responses cause silent failures. Lynkr sits in the middle and enforces “Anthropic-ness” as a contract:

- request rewriting
- tool-call lifecycle enforcement
- streaming normalization
- capability emulation when the backend can't support something natively

Happy to explain any of the protocol quirks if people are curious.
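To make the streaming-normalization point concrete: Claude Code expects Anthropic's SSE event sequence (`message_start`, `content_block_start`, `content_block_delta`, `content_block_stop`, `message_stop`), so a proxy has to re-wrap whatever the backend streams into those events. A simplified sketch of that re-wrapping, with heavily trimmed payloads; this is illustrative, not Lynkr's actual implementation:

```python
import json
from typing import Iterable, Iterator

def to_anthropic_sse(chunks: Iterable[str]) -> Iterator[str]:
    """Wrap raw text chunks from a backend in Anthropic-style SSE events.

    Event names follow Anthropic's streaming format; the payloads here
    are stripped down to the bare minimum for illustration.
    """
    def event(name: str, data: dict) -> str:
        return f"event: {name}\ndata: {json.dumps(data)}\n\n"

    yield event("message_start", {"type": "message_start"})
    yield event("content_block_start",
                {"type": "content_block_start", "index": 0})
    for chunk in chunks:
        # Each backend chunk becomes one text_delta on content block 0.
        yield event("content_block_delta",
                    {"type": "content_block_delta", "index": 0,
                     "delta": {"type": "text_delta", "text": chunk}})
    yield event("content_block_stop",
                {"type": "content_block_stop", "index": 0})
    yield event("message_stop", {"type": "message_stop"})

events = list(to_anthropic_sse(["Hel", "lo"]))
```

The real protocol also carries usage accounting, stop reasons, and tool-use blocks in these events, which is where most of the "silent failure" edge cases live.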