Show HN: Build apps with 500 models locally. No tracking, no cloud, just code

github.com

5 points by Gerome24 4 days ago · 2 comments · 2 min read

I built CodinIT because I wanted that "Bolt-like" experience, but on my own terms.

100% Open Source

The core idea: You should be able to prompt a full-stack application into existence, but the environment should be local, the models should be swappable (Ollama/LM Studio support was a priority), and the output should be standard code you actually own.
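For readers wondering what "swappable" means in practice: both Ollama and LM Studio expose OpenAI-compatible HTTP endpoints on localhost, so switching backends can reduce to swapping a base URL and model name. This is a hypothetical sketch of that pattern, not CodinIT's actual internals; the names, ports, and `buildChatRequest` helper are illustrative.

```typescript
// Hypothetical provider registry; CodinIT's real implementation may differ.
// Both Ollama and LM Studio serve OpenAI-compatible APIs locally, so
// "swapping models" can be just a configuration change.
interface ProviderConfig {
  baseUrl: string; // OpenAI-compatible endpoint root
  model: string;   // model identifier as the backend knows it
}

const providers: Record<string, ProviderConfig> = {
  // Ollama's OpenAI-compatible endpoint (default port 11434)
  ollama: { baseUrl: "http://localhost:11434/v1", model: "llama3" },
  // LM Studio's local server (default port 1234)
  lmstudio: { baseUrl: "http://localhost:1234/v1", model: "local-model" },
};

// Build a chat request; only the target URL and model change per provider.
function buildChatRequest(provider: string, prompt: string) {
  const cfg = providers[provider];
  if (!cfg) throw new Error(`Unknown provider: ${provider}`);
  return {
    url: `${cfg.baseUrl}/chat/completions`,
    body: {
      model: cfg.model,
      messages: [{ role: "user", content: prompt }],
    },
  };
}
```

With an OpenAI-compatible request shape like this, the same client code can target either backend, and a cloud provider would just be another entry in the map.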

A few things I focused on:

Context Management: One of the hardest parts was figuring out how to feed the right file context back to the LLM without blowing out the token limit. I’ve implemented a custom indexing approach to keep the "vibe coding" flow snappy.

The "Local" Win: Since it runs locally, you can use it on a plane or in a coffee shop without burning through a data plan or worrying about API latency.

Stack: It’s primarily focused on the Node.js ecosystem right now, as that's where I found the most consistent generation results.
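The context-management point above is the interesting one to me. I don't know CodinIT's exact indexing scheme, but the budget problem it solves looks roughly like this: score files by relevance, estimate token cost cheaply (~4 characters per token is a common heuristic for English text and code), and greedily pack files until the context budget is spent. A minimal sketch, with all names illustrative:

```typescript
// Illustrative sketch of budget-aware context selection, not CodinIT's code.
interface IndexedFile {
  path: string;
  content: string;
  relevance: number; // 0..1 score from whatever index/ranker is in use
}

// Cheap token estimate: roughly 4 characters per token.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

// Greedily pack the highest-relevance files that still fit the budget.
function selectContext(files: IndexedFile[], budgetTokens: number): IndexedFile[] {
  const ranked = [...files].sort((a, b) => b.relevance - a.relevance);
  const selected: IndexedFile[] = [];
  let used = 0;
  for (const f of ranked) {
    const cost = estimateTokens(f.content);
    if (used + cost > budgetTokens) continue; // too big; try smaller files
    selected.push(f);
    used += cost;
  }
  return selected;
}
```

The greedy skip (rather than stopping at the first oversized file) matters: one huge but relevant file shouldn't crowd out several small ones that still fit.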

It’s definitely a work in progress. Specifically, I'm still fine-tuning how it handles large-scale refactors across multiple files, which is where most AI coders tend to trip up.

I’m really curious to hear from this community—do you actually prefer using local models for this kind of work, or is the "intelligence gap" between a local Llama 3 and Claude 4.5 Sonnet still too big for your daily use?

Site: https://codinit.dev
