Show HN: Local AI stack (Docker, Ollama) that lets you build apps without Python

github.com

1 points by aeberhart 2 months ago · 0 comments · 1 min read


We built a local-first AI stack (Docker + Ollama) that lets you build LLM tools and workflows without writing Python. It supports:

* Multimodal chat
* Retrieval-Augmented Generation (incl. automated & scheduled doc import)
* MCP tool support (web search, file access, O365)
* Custom tools implemented in JSONata & SQL

Most local AI tools are chat UIs. We wanted something closer to a programmable AI platform that still runs locally.

* Leverage a low-code platform to insert programmatic hooks into the chat
* Provide custom tools written in SQL or JSONata
* Embed LLMs in workflows and custom UIs
* Apply fine-grained, role-based access control to AI apps (who can see what, and which tools the LLM is allowed to use)

In short, it aims to be as flexible as custom Python code while being as easy to use as Open WebUI.
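To make that comparison concrete, here is the kind of Python glue code such a platform is meant to replace: a minimal sketch that calls a locally running Ollama instance directly over its `/api/generate` HTTP endpoint (the default port 11434 is Ollama's standard; the function names here are illustrative, not part of the project):

```python
import json
import urllib.request

# Ollama's default local API endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of a stream
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a local Ollama instance and return the response text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("llama3", "Summarize this document in one sentence."))
```

In the stack described above, the same call would instead be wired up declaratively (e.g. as a JSONata- or SQL-backed tool) rather than as hand-written HTTP plumbing.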

