Local LLM Setup on Windows with Ollama and LM Studio (ThinkPad / RTX A3000 GPU)

github.com

4 points by appsoftware a month ago · 3 comments

appsoftwareOP a month ago

This is a walkthrough of my setup of local LLM capability on a Lenovo ThinkPad P1 Gen 4 (with an RTX A3000 6GB VRAM graphics card), using Ollama for CLI and VS Code Copilot chat access, and LM Studio for a GUI option.

My Lenovo ThinkPad P1 Gen 4 is coming up on four years old. It is a powerful workstation, and has a good, but by no means state-of-the-art, GPU in the RTX A3000. My expectation is that many developers will have a PC capable of running local LLMs as I have set up here.
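As a rough sketch of the basic Ollama CLI flow the walkthrough covers, the commands below pull and run a model from the terminal. The model name here is an assumption, not the one from the repo: a ~3B-parameter model is a plausible fit for 6 GB of VRAM, but substitute whatever fits your card. The `command -v` guard just skips the calls on machines without Ollama installed.

```shell
# Assumption: a small (~3B) model that should fit in 6 GB VRAM.
# Swap in the model the walkthrough actually recommends.
MODEL="llama3.2:3b"

if command -v ollama >/dev/null 2>&1; then
  # Download the model weights from the Ollama registry
  ollama pull "$MODEL"
  # Start an interactive session with a one-shot prompt
  ollama run "$MODEL" "Say hello in one sentence."
else
  echo "ollama not on PATH; install it first (https://ollama.com)"
fi
```

The same local server also backs the VS Code Copilot chat integration mentioned above, so getting the CLI working is a good first smoke test.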

See the GitHub repository for the full walkthrough:

https://github.com/gbro3n/local-ai/blob/main/docs/local-llm-...

Ref: https://www.appsoftware.com/blog/local-llm-setup-on-windows-...

akssassin907 a month ago

By Moore's Law, a 4-year-old machine should be a quarter of what's available today; it should be struggling. Instead it's rocking. Either Moore's Law is stalling, or software efficiency is finally catching up. Either way, the "you need new hardware" argument is getting weaker every day. Long live tired silicon!

  • appsoftwareOP a month ago

    OLLAMA_FLASH_ATTENTION is essential for getting reasonable performance. Yes, I did well with that laptop; it's been a trusty steed :) A refurb, too.
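For anyone looking for the setting the OP mentions: OLLAMA_FLASH_ATTENTION is an environment variable read by the Ollama server process, and it has to be set before the server starts. A minimal sketch, with the POSIX form for a shell session and the Windows-persistent form in comments:

```shell
# Enable flash attention in Ollama's backend (read at server startup)
export OLLAMA_FLASH_ATTENTION=1

# On Windows, the persistent equivalent is a user environment variable:
#   setx OLLAMA_FLASH_ATTENTION 1
# (or set it via System Properties > Environment Variables)

# Restart the Ollama service/tray app afterwards so it picks the value up.
echo "OLLAMA_FLASH_ATTENTION=$OLLAMA_FLASH_ATTENTION"
```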
