Show HN: Vibe Coded my own chat app with Meta Llama-Stack, and MCP integrations

github.com

1 point by r2ob 8 months ago · 0 comments · 1 min read

I needed a quick way to experiment with two very different kinds of tools at once: custom Python functions living on my laptop and MCP-server tools exposed remotely through Supergateway. Instead of juggling multiple demos, I spun up my own chat app with Gradio, wired it to Meta’s Llama-Stack, and made everything talk to the same agent. With this setup I can drop a new local tool in a folder, point to a hosted endpoint, and see both fire in the same conversation stream: no heavy front-end work, just an instant playground for prompt tweaks, tool-calling logic, and Supergateway configs. Building the app scratched my itch for fast iteration, and it keeps all my LLM experiments under one roof.
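The core idea — one registry where local Python functions and remote MCP endpoints look the same to the agent — can be sketched as plain Python. This is a minimal illustration, not the actual Llama-Stack or Supergateway API; the `ToolRouter` class, its method names, and the endpoint URL are all hypothetical:

```python
# Sketch of the unified tool-routing idea: local Python functions and
# remote MCP tools are registered in one table, so a single agent can
# call either without knowing where the tool lives. All names here are
# illustrative, not the real Llama-Stack API.

from typing import Callable, Dict


class ToolRouter:
    """Dispatch tool calls to local functions or remote MCP endpoints."""

    def __init__(self) -> None:
        self._local: Dict[str, Callable[..., str]] = {}
        self._remote: Dict[str, str] = {}  # tool name -> MCP endpoint URL

    def register_local(self, name: str, fn: Callable[..., str]) -> None:
        # "Drop a new local tool in a folder" boils down to this call.
        self._local[name] = fn

    def register_remote(self, name: str, endpoint: str) -> None:
        # "Point to a hosted endpoint" exposed through Supergateway.
        self._remote[name] = endpoint

    def call(self, name: str, **kwargs) -> str:
        if name in self._local:
            return self._local[name](**kwargs)
        if name in self._remote:
            # In the real app this would be an HTTP/SSE round-trip
            # through Supergateway; stubbed out here for illustration.
            return f"[would call {name!r} at {self._remote[name]}]"
        raise KeyError(f"unknown tool: {name}")


router = ToolRouter()
router.register_local("shout", lambda text: text.upper())
router.register_remote("search", "http://localhost:8000/sse")  # hypothetical URL

print(router.call("shout", text="hello"))  # HELLO
print(router.call("search", query="llama"))
```

Both kinds of tool then fire in the same conversation stream because the agent only ever sees `router.call(...)`.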

