Show HN: Docgen – A C++ AI CLI to solve documentation hell with local LLMs

2 points by alonsovm 6 days ago · 0 comments · 2 min read


Hi HN,

I’m a solo dev who got tired of "documentation hell": either I spent hours writing docs that were immediately outdated, or I had no docs at all. I wanted a tool that treats documentation generation as a standard build step, so I built Docgen.

Docgen is a lightweight AI CLI tool written in C++ that automates docs-as-code. It sits in your repo (via a .docgen folder and a Docfile) and generates Markdown files next to your source.

A few technical details on how it works under the hood:

- Local-First & Private: It defaults to using Ollama locally so your proprietary code never leaves your machine (though it supports cloud APIs like OpenAI/Gemini if you prefer).

- Smart Incremental Builds: It uses content hashing. When you run docgen update, it only regenerates docs for files that actually changed, saving massive amounts of API credits and compute time.

- Context-Aware (RAG): It automatically analyzes #include dependencies to give the LLM the right context, rather than blindly feeding it a single file in isolation.

- Zero Dependencies: Compiled as a single static binary. Just download and run.

- The "Auto" Mode: This is my favorite part. If you run "docgen auto", it acts as a file watcher with a built-in debounce (waits a few seconds after you stop typing/saving). It quietly updates your Markdown docs in the background while you stay in your flow state.
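
To make the incremental-build bullet concrete, here is a minimal sketch of the content-hashing idea: keep a hash per source file and only regenerate when it changes. This is an illustration of the technique, not Docgen's actual code (Docgen likely uses a stronger hash than std::hash and persists the cache in .docgen).

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <unordered_map>

// Sketch: regenerate a doc only when the source file's content hash changed.
struct DocCache {
    std::unordered_map<std::string, size_t> hashes; // path -> last seen hash

    // Returns true if `content` differs from what we saw last time,
    // updating the cache so the next call treats it as unchanged.
    bool needs_update(const std::string& path, const std::string& content) {
        size_t h = std::hash<std::string>{}(content);
        auto it = hashes.find(path);
        if (it != hashes.end() && it->second == h) return false; // unchanged
        hashes[path] = h;
        return true; // new or changed -> regenerate docs, spend API credits
    }
};
```

On a run of docgen update, only the files for which needs_update returns true would be sent to the model.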
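
The context-aware bullet can be sketched too: scan a file's text for #include directives so the referenced headers can be pulled into the LLM prompt. This is an assumed approach for illustration; Docgen's real parser may be more sophisticated (e.g. handling conditional includes).

```cpp
#include <cassert>
#include <regex>
#include <string>
#include <vector>

// Sketch: collect the headers a source file includes, so their contents
// can be added to the model's context alongside the file itself.
std::vector<std::string> find_includes(const std::string& source) {
    static const std::regex inc(R"(#include\s*[<"]([^>"]+)[>"])");
    std::vector<std::string> deps;
    for (auto it = std::sregex_iterator(source.begin(), source.end(), inc);
         it != std::sregex_iterator(); ++it) {
        deps.push_back((*it)[1].str()); // capture group 1: the header name
    }
    return deps;
}
```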
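
And the debounce in auto mode boils down to: record the time of each file event, and only fire the rebuild once no event has arrived for a quiet period. A minimal sketch of that logic (assumed, not Docgen's implementation; the delay value is illustrative):

```cpp
#include <cassert>
#include <chrono>

// Sketch: suppress rebuilds until the user has stopped saving for `delay`.
class Debouncer {
    std::chrono::steady_clock::time_point last_event_{}; // zero = no event yet
    std::chrono::milliseconds delay_;
public:
    explicit Debouncer(std::chrono::milliseconds delay) : delay_(delay) {}

    // Called by the file watcher on every save event.
    void on_event(std::chrono::steady_clock::time_point now) {
        last_event_ = now;
    }

    // Polled from the watcher loop: true once the quiet period has elapsed.
    bool ready(std::chrono::steady_clock::time_point now) const {
        return last_event_ != std::chrono::steady_clock::time_point{} &&
               now - last_event_ >= delay_;
    }
};
```

Each new save pushes the deadline back, so docs only regenerate after the editing burst ends rather than on every keystroke.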

I’m currently focused on improving the RAG context handling.

You can check it out here: https://github.com/alonsovm44/docgen

I'd love to hear your thoughts, critique on the architecture, or any edge cases you think I should handle!
