Is it ok to use GPUs like the A4000 or RTX 3060 as inference servers?

1 point by kouohhashi 2 years ago · 1 comment · 1 min read


When it comes to inference on cloud services, Tesla T4 GPUs are commonly used, but they are not cheap. Building a server room with A4000 workstations or RTX 3060 nodes might be more cost-effective, but doing so could violate Nvidia's terms and conditions.
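The cost-effectiveness question above boils down to a break-even calculation: how many hours of continuous inference before a one-time hardware purchase undercuts cloud rental? A minimal sketch, with all prices as hypothetical placeholders (not quotes from any vendor):

```python
# Hypothetical figures for illustration only -- check real prices yourself.
CLOUD_T4_PER_HOUR = 0.35         # assumed on-demand cloud T4 rate ($/h)
A4000_WORKSTATION_COST = 1500.0  # assumed one-time hardware cost ($)
POWER_PER_HOUR = 0.03            # assumed electricity cost while running ($/h)

def breakeven_hours(hw_cost: float, power_rate: float, cloud_rate: float) -> float:
    """Hours of continuous use before on-prem hardware beats cloud rental."""
    return hw_cost / (cloud_rate - power_rate)

hours = breakeven_hours(A4000_WORKSTATION_COST, POWER_PER_HOUR, CLOUD_T4_PER_HOUR)
print(f"Break-even after ~{hours:.0f} hours (~{hours / 24 / 30:.1f} months)")
```

Under these made-up numbers the hardware pays for itself in roughly half a year of continuous use; the real answer depends heavily on utilization, since rented GPUs cost nothing when idle.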

On the other hand, some cloud services advertise that using the A4000 for inference is acceptable. Does this mean that, while official support from Nvidia might not be available, the practice is implicitly tolerated by Nvidia?

migf 2 years ago

This is giving me such Beowulf Cluster déjà vu.
