I Have a Stockpile of Servers and GPUs

2 points by BracketMaster 2 years ago · 3 comments · 1 min read


I have a stockpile of 22 servers, each with 8 AMD MI50 GPUs (see notes about the MI50 below). I've gotten PyTorch working on these GPUs and have been able to run inference for several large language models. I originally wanted to use them to serve LLMs, but vLLM's CUDA kernels don't work out of the box on the MI50, and llama.cpp has a bug where it only supports up to 4 AMD GPUs at once.

So, TL;DR: I don't want these servers sitting idle, and if anybody has creative, useful ideas for them, I'm happy to grant SSH access to piddle around.

MI50 specs:
- 16 GB VRAM
- 1 TB/s VRAM bandwidth
- 25 TFLOPS
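Given those specs, here is a rough back-of-envelope for what the fleet could serve. This is only a sketch: the fp16 weight sizing, the 20% overhead allowance, and the bandwidth-bound decode formula are generic rules of thumb I'm assuming, not numbers from the post.

```python
# Back-of-envelope sizing for the fleet described above.
# Assumptions (mine, not the poster's): fp16 weights at 2 bytes/param,
# ~20% extra VRAM for KV cache and activations, and single-stream decode
# being memory-bandwidth-bound, so tokens/s <= bandwidth / weight bytes.

GPUS_PER_SERVER = 8
SERVERS = 22
VRAM_GB = 16        # per MI50, from the specs above
BW_GB_S = 1000      # ~1 TB/s per MI50, from the specs above

total_vram_gb = SERVERS * GPUS_PER_SERVER * VRAM_GB   # fleet-wide VRAM
server_vram_gb = GPUS_PER_SERVER * VRAM_GB            # VRAM per server

def fits_fp16(params_billion, vram_gb, overhead=0.2):
    """Rough check: do fp16 weights (plus overhead) fit in vram_gb?"""
    need_gb = params_billion * 2 * (1 + overhead)
    return need_gb <= vram_gb

def decode_tokens_per_s(params_billion, bw_gb_s=BW_GB_S):
    """Bandwidth-bound upper limit for batch-1 decode if one GPU's
    bandwidth had to stream all fp16 weights each token."""
    return bw_gb_s / (params_billion * 2)

print(total_vram_gb)                  # 2816 GB across the fleet
print(fits_fp16(7, VRAM_GB))          # a 7B model is a squeeze on one card
print(fits_fp16(50, server_vram_gb))  # ~50B fits sharded across one server
print(round(decode_tokens_per_s(7)))  # ~71 tok/s bandwidth-bound ceiling
```

So one 8-GPU box has 128 GB of VRAM, enough to shard a mid-sized model at fp16, which is why a multi-GPU runtime (and the llama.cpp 4-GPU limit mentioned above) matters so much here.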

shortrounddev2 2 years ago

Consider contacting a university to donate your server time to for medical research

  • BracketMasterOP 2 years ago

    Sounds like a viable path. Do you have any universities or contacts in mind? Perhaps there is a forum where I could post about this?

    • shortrounddev2 2 years ago

      Sadly, I dropped out of college and have no relationship with my alma mater (Virginia Commonwealth University). However, they are historically known, regionally and somewhat nationally, as a medical school, and they have translated that into a bioinformatics program that produces research (no idea where it ranks in the grand scheme of things). Perhaps you could look there, or search for other universities with bioinformatics programs!
