How to run LLMs locally on mobile devices (with Gemma and On-Device AI tools)
Any model that can run on a mobile device will likely be 8B parameters or smaller, and will have very noticeable hallucination problems.