Show HN: AI Chat Adapter

github.com

1 point by kirth_gersen a year ago · 0 comments · 1 min read

This Python module makes it easier to get chat responses from multiple LLM API backends by providing a single class that acts as the interface to all supported backends. It can handle both local and remote LLMs: so far it supports OpenAI and Anthropic for remote APIs, and Ollama and LMStudio for local LLMs. I made this to experiment with having multiple LLMs chat with each other, but realized it might be more generally useful to others as well. For example, it would make it trivial to switch API providers for a critical service during a provider outage, or to run comparison tests across multiple LLMs. There is still a lot more I want to do: add support for Groq and other APIs, add support for chat streams, improve tool call support, and more. Your feedback is welcome, and if you can think of more cool use cases I'll add them to the README.
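To make the idea concrete, here is a minimal sketch of the adapter pattern described above — one class routing `chat()` calls to interchangeable backends. All names here (`ChatBackend`, `ChatAdapter`, `EchoBackend`) are hypothetical illustrations, not the module's actual API:

```python
from abc import ABC, abstractmethod


class ChatBackend(ABC):
    """Common interface every LLM backend must implement (hypothetical)."""

    @abstractmethod
    def chat(self, messages: list[dict]) -> str:
        """Take a list of {'role', 'content'} messages, return a reply."""


class EchoBackend(ChatBackend):
    """Stand-in 'local' backend so the sketch runs without any API keys."""

    def chat(self, messages: list[dict]) -> str:
        # A real OpenAI/Anthropic/Ollama backend would issue an API call here.
        return messages[-1]["content"].upper()


class ChatAdapter:
    """One class acting as the interface for all supported backends."""

    def __init__(self, backend: ChatBackend):
        self.backend = backend

    def chat(self, messages: list[dict]) -> str:
        # Swapping providers (e.g. during an outage) means swapping backends;
        # calling code never changes.
        return self.backend.chat(messages)


adapter = ChatAdapter(EchoBackend())
reply = adapter.chat([{"role": "user", "content": "hello"}])
```

Because every backend satisfies the same interface, comparison testing across LLMs is just a loop over a list of backends with the same message history.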

