Self-hosted AI data workflow: DB and Ollama and SQL

exasol.github.io

11 points by exasol_nerd 2 months ago · 3 comments

exasol_nerdOP 2 months ago

I wrote a tutorial on invoking the Mistral 7B model directly from SQL using Python UDFs in Exasol together with Ollama. It demonstrates a fully self-hosted AI pipeline where data never leaves your infrastructure: no API fees, no vendor lock-in. Setup takes about 15 minutes with Docker.
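For readers unfamiliar with the pattern, the core of such a pipeline is a Python function that posts a prompt to a locally running Ollama server and returns the generated text; an Exasol Python UDF then wraps that function so SQL can call it per row. Here is a minimal sketch, assuming Ollama's default REST endpoint on port 11434 and its `/api/generate` route; the function names and the UDF wiring shown in the comments are illustrative, not taken from the tutorial itself.

```python
import json
import urllib.request

# Ollama listens on port 11434 by default; /api/generate is its
# single-shot text-generation endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="mistral"):
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False requests one complete JSON reply instead of a
    stream of newline-delimited chunks."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt, model="mistral", url=OLLAMA_URL):
    """POST the prompt to the local Ollama server and return the
    generated text (the 'response' field of the JSON reply)."""
    body = json.dumps(build_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Inside an Exasol PYTHON3 scalar script, the entry point is a
# run(ctx) function; it would simply call ask_ollama(ctx.prompt)
# and return the result, so SQL can do:
#   SELECT llm(review_text) FROM reviews;
```

Because the data flow is SQL row → UDF → localhost HTTP call → local model, nothing ever crosses the machine boundary, which is the whole point of the self-hosted setup.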

  • pploug 2 months ago

    Purely curious, but why did you go with Ollama instead of the built-in LLM runner in Docker, since you're already using Docker?

    • exasol_nerdOP 2 months ago

      Great question! I went with Ollama because I found its setup slightly easier. Technically, both should offer the same experience, and hosting both in Docker makes a lot of sense. That will be the next iteration of my write-up!
