Running DeepSeek R1 on Your Own (cheap) Hardware – The fast and easy way

linux-howto.org

20 points by BimJeam a year ago · 19 comments

cwizou a year ago

Maybe you should add "distills" to the title? As this is about installing Ollama to grab the 7b or 14b R1-Qwen-distills, not "R1".

  • karmakaze a year ago

    "The fast and easy way" is also being oversold.

    > Why Ollama? Because it makes running large language models actually easy.

    > If it doesn’t work, fix your system. That’s not my problem.

  • nkozyra a year ago

    Right, and fundamentally no different than running any other ollama model that can run reasonably on your local machine.

  • BimJeamOP a year ago

    OK, I understand now and will fix that title. Sorry for the inconvenience. My bad. :-/

ghostie_plz a year ago

> Unless you like unnecessary risks. In that case, go ahead, genius.

what an off-putting start

Euphorbium a year ago

I have R1:1.5B running on my 8gb ram M4 mac mini. Don't know where I would use it, as it is too weak to solve actual problems, but it does run.

BimJeamOP a year ago

Set up a local AI with DeepSeek R1 on a dedicated Linux machine using Ollama—no cloud, no subscriptions, just raw AI power at your fingertips.
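The article's recipe boils down to installing Ollama and pulling one of the R1 distill tags from its model library. A minimal sketch of picking a tag by available RAM — the tag names (`deepseek-r1:1.5b`, `:7b`, `:14b`) come from Ollama's library, but the RAM thresholds here are rough assumptions, not official requirements:

```shell
#!/bin/sh
# Pick a DeepSeek-R1 distill tag based on available RAM in GB.
# Thresholds are rough guesses; the tags name Qwen/Llama distills, not full R1.
pick_tag() {
  ram_gb=$1
  if [ "$ram_gb" -ge 16 ]; then echo "deepseek-r1:14b"
  elif [ "$ram_gb" -ge 8 ]; then echo "deepseek-r1:7b"
  else echo "deepseek-r1:1.5b"
  fi
}

tag=$(pick_tag 8)
echo "ollama run $tag"   # prints the command you would actually run
```

On an 8 GB machine this prints `ollama run deepseek-r1:7b`. The Ollama install itself is a one-liner per its docs: `curl -fsSL https://ollama.com/install.sh | sh`.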

BimJeamOP a year ago

Sorry if you guys are getting overwhelmed with DeepSeek submissions these days. This will be my one and only for a while. It is cool to have a counterweight to all these paid models.

  • ai-christianson a year ago

    Personally I don't get sick of it. There's a lot of hype around Deepseek specifically rn, but to run SOTA or near SOTA models locally is a huge deal, even if it's slow.

    • danielbln a year ago

      The issue is that this article is conflating (as do many, many articles about the topic) the distilled versions of R1 (basically llama/qwen reasoning finetunes) with the real thing. We are not even talking about quantized versions of R1 here, so it's not quite accurate to say you're running R1 here.

assimpleaspossi a year ago

Are there any security concerns over DeepSeek as there are over TikTok?

Saw this in the article

>I would not recommend running this on your main system. Unless you like unnecessary risks.

  • croes a year ago

    The model itself can't do anything bad beyond giving false answers or refusing to answer.

    The risk is using hosted versions where the host collects data, or running the model with unknown software.

donclark a year ago

I like this. However, I did not find any minimum specs or speed figures. Maybe I missed them? Can someone point me in the right direction please?
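The article doesn't state specs, but a common back-of-envelope estimate (an assumption, not from the article) is that a Q4-quantized model needs about half a byte per parameter for weights, plus some overhead for the KV cache and runtime:

```shell
# Rough RAM estimate for a Q4-quantized model:
# params (billions) * 4 bits / 8 bits-per-byte, plus ~20% overhead (assumed).
awk -v p=7 'BEGIN { printf "~%.1f GB\n", p * 0.5 * 1.2 }'   # 7B distill -> ~4.2 GB
```

That lines up roughly with Ollama's own guidance of about 8 GB of RAM for 7B-class models and 16 GB for 13B-class models.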
