Cannoli allows you to build and run no-code LLM scripts in Obsidian
I'm having fun with this visual editor for LLM scripts. It's almost like HyperCard for LLMs.
On my 16GB MacBook Air, I did not have to set the OLLAMA_ORIGINS environment variable. Maybe I set it a long time ago, since I have a previous Ollama install. This is the first really fun toy/tool I've found that uses local LLMs (it also supports foundation model APIs) to do something interesting.
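If Obsidian can't reach a local Ollama server, the usual fix is to allow Obsidian's origin via OLLAMA_ORIGINS before starting Ollama. A minimal sketch, assuming the default Ollama setup and that Obsidian's app origin is `app://obsidian.md*` (verify against your own install):

```shell
# Allow cross-origin requests from Obsidian's app origin
# (assumption: the origin string is app://obsidian.md*)
export OLLAMA_ORIGINS="app://obsidian.md*"

# Start the Ollama server with that origin allowed
ollama serve
```

On macOS, if Ollama runs as the menu-bar app rather than from a terminal, one way to make the setting visible to it is `launchctl setenv OLLAMA_ORIGINS "app://obsidian.md*"`, then restarting Ollama.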
I'm having a ball!