Ask HN: Where to Host Llama 2?

3 points by retrovrv 2 years ago | 2 comments

There's ollama/ggml etc. for local setup, but other than Replicate, what are the other options for hosting Llama 2?

brucethemoose2 2 years ago

vast.ai is a popular and economical option.

A single 3090 will host a 70B quant reasonably well (with some layers offloaded to CPU); two will fit it completely in VRAM.
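
Rough math behind that claim, as a sketch assuming a ~4-bit quant; exact numbers vary by quant format, context length, and backend:

    # Back-of-the-envelope VRAM estimate for a 70B model at ~4-bit.
    # bits_per_weight is an assumption (Q4_K_M is roughly 4.5 effective bits).
    params_b = 70                                 # billions of parameters
    bits_per_weight = 4.5
    weights_gb = params_b * bits_per_weight / 8   # ~39 GB of weights
    kv_cache_gb = 2                               # rough guess, small context
    total_gb = weights_gb + kv_cache_gb           # ~41 GB
    print(f"~{total_gb:.0f} GB: one 24 GB 3090 needs CPU offload, two (48 GB) fit it")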

Another option I suggest is hosting on the AI Horde with koboldcpp, if the UI/API works for you and the finetune is appropriate for public use. You get priority access to your own host, but fulfilling other people's prompts in its spare time earns you kudos, which you can spend to try other models people are hosting or to get more burst throughput.

https://lite.koboldai.net/#
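
For context, here is a minimal sketch of spending kudos by submitting a prompt to the AI Horde's text endpoint. The endpoint paths and payload fields are assumptions based on the public v2 API, and the model name is hypothetical; check the docs at https://aihorde.net before relying on this:

    import time
    import requests

    HORDE = "https://aihorde.net/api/v2"
    # "0000000000" is the anonymous key; a registered key earns and spends kudos.
    HEADERS = {"apikey": "0000000000"}

    # Submit an async text-generation request (field names are assumptions;
    # consult the v2 API docs for the full schema).
    job = requests.post(
        f"{HORDE}/generate/text/async",
        headers=HEADERS,
        json={
            "prompt": "Explain kudos in one sentence.",
            "params": {"max_length": 80},
            "models": ["koboldcpp/LLaMA2-13B-example"],  # hypothetical model name
        },
    ).json()

    # Poll until a worker picks the job up and finishes it.
    while True:
        status = requests.get(f"{HORDE}/generate/text/status/{job['id']}").json()
        if status.get("done"):
            print(status["generations"][0]["text"])
            break
        time.sleep(5)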
