CharlieRuan
- Karma: 3
- Created: 2 years ago
Recent Submissions
1. In-browser LLM inference engine with WebGPU and OpenAI API (blog.mlc.ai)
2. Gemma locally on iOS, Android, web browsers, and GPUs with a single framework (old.reddit.com)
3. Running LLM (phi-2) locally on latest Google Chrome Android (twitter.com)