Show HN: Namefi built a WebGPU-powered in-browser LLM with pure client-side inference
search.labs.namefi.io

We've built a domain brainstorming experience powered by a WebGPU in-browser LLM with pure client-side inference, and it seems to work pretty well. It's not on par with ChatGPT/Gemini, for sure, but it's good enough for this use case.
A few benefits:
1. Runs purely client-side, so it's privacy-friendly.
2. Free: no credits, no API keys, nothing. The user's device pays for the computation.
3. Interoperable: you can steer the direction of the ideas it comes up with.
QQ: What are a few other things you think this could be used for?

QQ: If on-device LLMs become much more powerful, what will they be able to do that they can't while they're less powerful?