OMLX – LLM inference, optimized for your Mac

omlx.ai

2 points by wrxd 14 days ago · 1 comment

threecheese 14 days ago

It would be helpful to benchmark against other providers that sit atop MLX; this page tells me how OMLX does, but not why I should move from another (like LM Studio, etc.). I get that you have some features that you might only find in vLLM, but how do I know whether Ollama would be X tps slower? TBH, not seeing competitors in a benchmark makes it less a benchmark and more a data sheet.

https://omlx.ai/benchmarks
