Ask HN: Distributing a Local LLM with an Application
Since some models can now run on most consumer hardware (e.g., the Phi models), is there a way to distribute an LLM packaged with whatever app you're building? That way users don't need to download anything extra: they just download your app, the local model is bundled inside it, and inference runs on their own hardware. This might be a naive question; I'm still pretty new to this field.
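To make the question concrete, here's roughly what I'm picturing: a minimal sketch assuming llama-cpp-python as the inference library and a quantized Phi model in GGUF format shipped next to the app. The model filename and the PyInstaller layout are just placeholders, not a recommendation:

    import sys
    from pathlib import Path

    from llama_cpp import Llama  # llama-cpp-python, shipped as an app dependency

    def bundled_model_path() -> Path:
        # When frozen with PyInstaller, bundled data files are unpacked to
        # sys._MEIPASS; during development, fall back to the source directory.
        base = Path(getattr(sys, "_MEIPASS", Path(__file__).parent))
        return base / "models" / "phi-3-mini-4k-instruct-q4.gguf"  # placeholder name

    # Load the bundled model entirely locally -- no extra download step.
    llm = Llama(model_path=str(bundled_model_path()), n_ctx=2048)

    out = llm("Q: What's a good way to ship a local model? A:", max_tokens=64)
    print(out["choices"][0]["text"])

Is this the usual approach, or do most apps download model weights on first launch instead (e.g., to keep the installer small)?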