Ask HN: Distributing a Local LLM with an Application

1 point by FezzikTheGiant 2 years ago · 0 comments · 1 min read


Since some models can run on most consumer hardware (e.g. the Phi models), is there a way to distribute an LLM packaged with whatever app you're building? That way users don't need to download anything extra: they just download your app, which bundles a local model and runs it on their hardware. This might be a naive question; I'm still pretty new to this field.
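One way this is commonly done (a sketch, not something from the thread): ship the model weights, e.g. a GGUF file, as a resource inside your app bundle, then resolve the file's path at runtime and hand it to a local inference runtime such as llama.cpp. The helper name `bundled_model_path`, the `models/` directory layout, and the PyInstaller `sys._MEIPASS` check below are all illustrative assumptions, not a standard API:

```python
import sys
from pathlib import Path

def bundled_model_path(filename: str) -> Path:
    """Resolve a model file shipped alongside the application.

    PyInstaller one-file builds unpack bundled resources into a
    temporary directory exposed as sys._MEIPASS; in a normal
    (unpackaged) run, we fall back to the directory containing
    this script. The "models/" subdirectory is an assumed layout.
    """
    base = Path(getattr(sys, "_MEIPASS", Path(__file__).resolve().parent))
    return base / "models" / filename

# At startup, the app would pass this path to a local runtime,
# e.g. with llama-cpp-python (hypothetical usage, not run here):
#   from llama_cpp import Llama
#   llm = Llama(model_path=str(bundled_model_path("phi-2.Q4_K_M.gguf")))
print(bundled_model_path("phi-2.Q4_K_M.gguf").name)
```

The main cost of this approach is download size: even a quantized small model adds gigabytes to the installer, which is why some apps instead fetch the weights on first launch rather than bundling them.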

