Ask HN: Browser-Based LLM Models?

4 points by lulzury a year ago · 5 comments


Does anyone know if there are any plans for browsers to natively integrate LLMs, LLM APIs, or local models like Llama for use by web applications?

I feel there's a large opportunity here for a more privacy-friendly, on-device solution that doesn't send the user's data to OpenAI.

Is RAM the current main limitation?

throwaway888abc a year ago

https://simonwillison.net/2024/Jul/3/chrome-prompt-playgroun...

https://developer.chrome.com/docs/ai/built-in

  • lulzury (OP) a year ago

    Thank you! This is exactly what I was looking for. I hope these become part of the web platform APIs! With Google pushing this effort, it might be highly likely.

    I hope Apple will follow suit with some of their small models (https://huggingface.co/apple/OpenELM).

    And then maybe even Firefox will join them...
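For reference, Chrome's built-in AI work exposes an experimental Prompt API behind flags. The exact surface has changed across Chrome versions, so the `LanguageModel` global and method names below are assumptions based on recent drafts, not a stable API; feature-detect before using it:

```javascript
// Feature-detect the experimental built-in Prompt API.
// (Assumed surface: a `LanguageModel` global with a `create()` factory.)
function supportsPromptAPI() {
  return typeof globalThis.LanguageModel !== "undefined" &&
         typeof globalThis.LanguageModel.create === "function";
}

// Ask the on-device model a question; nothing leaves the browser.
async function summarize(text) {
  if (!supportsPromptAPI()) {
    throw new Error("Built-in Prompt API not available in this browser");
  }
  const session = await globalThis.LanguageModel.create();
  return session.prompt(`Summarize in one sentence: ${text}`);
}
```

In browsers without the flag enabled (or outside Chrome), `supportsPromptAPI()` simply returns false, so pages can fall back to a server-side model.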

throwaway425933 a year ago

Every big tech company is trying to do this: FB (through WhatsApp), Google (through Chrome/Android), Apple (through Safari/iOS/etc.). As soon as they meet their internal metrics, they will release these to the public.

FrenchDevRemote a year ago

"Is RAM the current main limitation?"

(V)RAM + processing power + storage (I mean, what kind of average user wants to clog half their hard drive with a subpar model that outputs one token a second?)
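A back-of-the-envelope way to quantify the storage/RAM point: model weights take roughly parameter count times bytes per parameter (ignoring activation and KV-cache overhead). A quick sketch:

```javascript
// Rough weight-memory estimate in GB: parameters × bits per parameter / 8.
// Ignores runtime overhead (activations, KV cache), so treat it as a floor.
function modelMemoryGB(paramsBillions, bitsPerParam) {
  return (paramsBillions * 1e9 * bitsPerParam / 8) / 1e9;
}

console.log(modelMemoryGB(7, 4));   // → 3.5  (a 4-bit-quantized 7B model)
console.log(modelMemoryGB(7, 16));  // → 14   (the same model at fp16)
```

So even aggressively quantized 7B-class models need several GB of (V)RAM and disk, which is why shipping them to average users' browsers is hard today.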

Crier1002 a year ago

check out https://github.com/mlc-ai/web-llm

IMO the main limitations are access to powerful GPUs for running models locally and the size of some models, which causes UX problems with cold starts.
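For the curious, a rough sketch of what a web-llm call looks like. This assumes the `CreateMLCEngine` entry point and the OpenAI-style chat API described in the project's README; the model ID is illustrative, and the library requires a WebGPU-capable browser:

```javascript
// Runs entirely in the browser via WebGPU; no data leaves the device.
async function askLocalModel(question) {
  // Dynamic import so the library only loads when actually used in a browser.
  const { CreateMLCEngine } = await import("@mlc-ai/web-llm");

  // The first call downloads and caches multi-GB model weights --
  // exactly the cold-start UX problem mentioned above.
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");

  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: question }],
  });
  return reply.choices[0].message.content;
}
```

After the initial download the weights are served from the browser cache, so subsequent loads are much faster than the first.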
