With Nvidia's GB10 Superchip, I’m Running Serious AI Models in My Living Room. You Can, Too


AI dominates headlines, product launches, and the markets nowadays. But I've always been fascinated by one crucial aspect of it that doesn't get much public attention: how it works behind the scenes. As a software engineer, I’ve yearned to move beyond using AI to actually building with it, which until recently required hardware well beyond anything like a hobbyist's budget.

That all changed late last year.

With the debut of a host of mini-desktops based on the GB10, Nvidia's Grace Blackwell "superchip" ("superchip" is Nvidia's own, but apt, term), the barrier to entry has lowered significantly. The GB10 is a boiled-down version of the same Grace CPU and Blackwell GPU silicon running on racks in data centers, with the same software stack and unified memory design. The usual slate of PC OEMs, among them Acer, Dell, HP, and Gigabyte, has rolled out an initial army of GB10-based mini systems designed for AI model work. Most cost around $4,000, and with them, AI development is no longer bound to computer labs. It's a reality sitting on my desk.

I got my hands on an early GB10 box, Dell’s Pro Max mini workstation, which starts at a little over $4,000. Dell and Nvidia have teamed up to help kick off my AI development journey, providing me with the Pro Max unit and a brief introduction to the hardware and software so I can share what it's like to use an AI supercomputer in your home. Over a series of articles, I'll take you from unboxing the Pro Max GB10 to firing up my first AI image generator in Linux and the many discoveries in between and after.

So far, it's been an exciting and engaging trip: My first interactions with the platform have only left me wanting more time with it to develop more complex projects, and they're coming. First, here's what the Dell Pro Max and Nvidia GB10 are all about, and what it's like to open and use a GB10 AI-development box for the first time.


A Closer Look: Nvidia’s Grace Blackwell GB10 Superchip

The GB10 pairs an Nvidia "Blackwell"-architecture GPU with an Nvidia "Grace" CPU, a 20-core Arm-based chip. On top of that potent combo is 128GB of LPDDR5X unified memory on tap. That giant dollop of RAM gives the GB10 enough overhead to run 200-billion-parameter models. For context: Considering that Nvidia's flagship consumer graphics card, the GeForce RTX 5090, has “only” 32GB of RAM, this is seriously capable hardware at a cost similar to a middle-of-the-road desktop workstation.

Don't get any ideas, gamers: The GPU inside the GB10 isn’t as powerful at graphics rendering as what you’ll get in a top-level graphics card. The GB10's secret weapon is its extra memory, which is invaluable for running medium- and large-scale local AI models that typically won’t fit in even the RTX 5090’s otherwise generous buffer. (Also, the GB10 boxes run a low-impact version of the Linux operating system, which will help minimize performance hits.) In addition, the GB10 is exceptionally power-efficient: Dell's GB10 Pro Max desktop uses a 280-watt laptop-style power adapter, rather than a conventional desktop power supply.

The various GB10 boxes look different, but share much of their internals. Dell, like the other OEMs, built its GB10-based Pro Max using Nvidia’s reference design and its GB10 hardware. Nvidia also sells its own "classic" reference-design version of a GB10 system, the Nvidia DGX Spark (pictured below; we're also test-driving one of those).

Nvidia DGX Spark

(Credit: Joseph Maldonado)

Dell’s version is unique because of its case, cooling system, and port layout. Devices from the other Nvidia partners I mentioned earlier follow the same path: individualized chassis designs and thermal gear, but much the same internals.

Dell Pro Max With GB10


Meet the Dell Pro Max With GB10: Unboxed and Running in No Time

One of the biggest hurdles in AI adoption isn’t hardware: It’s the time it takes to get from purchase to productivity. Dell’s Pro Max With GB10 practically eliminates that hurdle. In less than an hour, I had the system unboxed, connected to Wi-Fi, and running my first Nvidia lab. If you’ve shied away from experimenting with AI development because of the startup time, this might help you reconsider.

Dell Pro Max With GB10

(Credit: Charles Jefferies)

Before getting into the setup, let’s focus on the machine's compact design. Measuring 2 inches high by 5.9 inches square and weighing 2.9 pounds, the Pro Max With GB10 truly can be held in one hand. (I've included some of my Lego friends for scale.) Its metal chassis feels reassuringly solid, and the honeycomb front grate makes it look like a miniature blade server. Connectivity includes Wi-Fi 7 and Bluetooth 5.4. My unit includes a 4TB SSD, roughly a $695 upgrade over the standard 1TB drive in Dell's base model, which was $4,061 when I wrote this. Beyond the storage choice (1TB, 2TB, or 4TB), Dell offers no other hardware configuration options; everything else is fixed, set specifically for the system's intended purpose.

Dell Pro Max With GB10

(Credit: Charles Jefferies)

All the physical ports sit along the rear shelf: USB‑C power, three USB‑C 3.2 Gen 2 ports (each supporting DisplayPort output), HDMI 2.1b, and 10Gbps Ethernet. On the far right, visible below, are dual 200Gbps ConnectX‑7 SmartNIC ports.

Dell Pro Max With GB10

(Credit: Charles Jefferies)

If those last ports look unfamiliar (they very well may, unless you know telecom or server networking), they use Quad Small Form-factor Pluggable (QSFP) cables to connect multiple Pro Max GB10 units together. At the moment, you can pair two systems to address a combined 256GB of unified memory, with support for larger clusters expected down the line.

You can access the Pro Max either wired or wirelessly. A wired setup is straightforward: Connect a monitor and peripherals, and treat it like any other mini desktop. But the wireless path is where the device shines brightest. Out of the box, it broadcasts its own Wi‑Fi hotspot, with the SSID, password, and setup URL printed on a sticker. I connected my Lenovo ThinkPad laptop, opened the URL, created a local account (no cloud login required), applied a few updates, and was ready to go. The entire process is seamless.

Dell Pro Max desktop running Nvidia GB10 AI hardware

(Credit: Dell/Nvidia)

With setup complete, the only thing left was to start putting the Pro Max to work through my development tools.


As a lifelong Windows user, I’ve dabbled in Linux only a few times before the Pro Max With GB10. This system runs a customized version of Ubuntu, one of the more popular Linux distros. Basic tasks like web surfing and navigating the file system are intuitive.

However, you do not actually need to use Ubuntu on the Pro Max directly, since all its resources are accessible over a local network from a computer of your choice, thanks to an app called Nvidia Sync. On my Windows 11 PC, the app lives in the system tray and lets me easily connect to the Pro Max from there.

Dell Pro Max desktop running Nvidia GB10 AI hardware

(Credit: Dell/Nvidia)

Opening the DGX Dashboard provides basic information and controls for JupyterLab, a web-based development environment. You can also monitor system resource usage and initiate system updates. Plus, multiple users can connect to the Pro Max at once, a versatility boost over running everything on your local PC.

Now that I had the Dell Pro Max up and running, including the option to use it remotely, the only thing left to do was to pick an Nvidia Playbook to begin familiarizing myself with the GB10’s possibilities.


Getting Started With AI Development: First, Do It by the Playbook

I must give Nvidia credit: Its Playbooks are excellent on-ramps to practical AI work. Even with a software-engineering background, I am new to many of the concepts, tools, and platforms used in the AI world. Completely free and backed by excellent user forums for support, the Playbooks gradually introduce you to the GB10’s possibilities, step by step. When I hit a snag—which I certainly have, a few times—I can use a combination of those forums, YouTube, and Copilot to get going again quickly.

Newcomers, particularly those without an AI background, can jump into almost any Playbook. I would suggest, though, sticking to one where the prerequisite knowledge feels at least somewhat familiar. So far, I have found Nvidia's estimated completion time for each Playbook accurate, though I occasionally pause to familiarize myself with a concept. (Let's just say, as time went on, YouTube quickly dominated my browsing history.)


My first real project on the Pro Max With GB10 was a Python script in Visual Studio Code (VS Code) that leveraged Nvidia cuPyNumeric, which I discovered by following Nvidia’s Developer YouTube channel. The tutorial walked me through installing Linux packages, creating a Python development environment, and running scripts on the Pro Max With GB10 through VS Code. It also reinforced that the GB10 works seamlessly over a network; ever since I set it up that way, I've seldom hooked up a monitor to the Pro Max. I simply power it on and access it from the Nvidia Sync app on my ThinkPad.
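To give a flavor of what such a script looks like (this is a minimal sketch, not the exact script from Nvidia's tutorial), cuPyNumeric is designed as a drop-in replacement for NumPy: you change a single import, and the same array code runs on the GPU. The fallback import here is my own addition so the sketch also runs on a machine without cuPyNumeric installed.

```python
# A minimal sketch of the cuPyNumeric "drop-in NumPy" idea -- not the exact
# script from Nvidia's tutorial. On a GB10 box with cuPyNumeric installed,
# the first import transparently runs the array math on the Blackwell GPU;
# elsewhere, the fallback uses plain NumPy so the script still runs.
try:
    import cupynumeric as np  # GPU-accelerated, NumPy-compatible API
except ImportError:
    import numpy as np        # CPU fallback for machines without cuPyNumeric

# The same NumPy-style code works either way: build two matrices,
# multiply them, and reduce the result.
a = np.arange(12.0).reshape(3, 4)
b = np.ones((4, 3))
product = a @ b               # matrix multiply
total = float(product.sum())

print(product.shape)  # (3, 3)
print(total)          # 198.0
```

The appeal is that existing NumPy code needs essentially no rewriting to take advantage of the GB10's GPU and unified memory.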

Another early lesson: The Linux terminal is central to AI development. Installing packages, running commands, and managing environments all happen there. Fortunately, accessing it in VS Code is trivial: typing View: Show Terminal in the command palette instantly drops me into a shell.

Dell Pro Max desktop running Nvidia GB10 AI hardware

(Credit: Dell/Nvidia)

The best part is that I rarely have to write commands from scratch. The Playbooks and tutorials provide nearly everything you need, and Copilot fills in the gaps when I'm not sure what to type. With my confidence growing, I'm ready to take on a more challenging Playbook.


A First Project: Setting Up an AI Image Generator With ComfyUI

Image generation from a written prompt is one of the most popular and accessible applications of consumer-facing AI. Many online services do that kind of thing for free, but they almost always impose limits on resolution or how many images you can create. Running your own generator locally removes those constraints and adds the perk of data privacy, since nothing gets shared. That's why the 45‑minute ComfyUI Playbook immediately caught my attention. In AI circles, ComfyUI is a well-known, open-source user interface for building image-generation workflows.

After completing the cuPyNumeric tutorial, I already had the essentials down: Python virtual environments, terminal commands, and the general workflow of developing on the Pro Max GB10. The Playbook also suggests acquiring skills in areas like container deployment and deep-learning model setup, where I'm far less experienced. But that hasn't mattered: Between YouTube and Copilot, I can bridge any gaps along the way.

Dell Pro Max desktop running Nvidia GB10 AI hardware

(Credit: Dell/Nvidia)

With all the installation steps complete, I can access ComfyUI in my web browser and do everything from there. 


Dell Pro Max desktop running Nvidia GB10 AI hardware

(Credit: Dell/Nvidia/ComfyUI)

ComfyUI includes templates for just about any workflow you can imagine. I started with Z-Image-Turbo Text to Image, which immediately warned me that I was missing models. I downloaded them locally and, with the help of Copilot, used a simple PowerShell SCP command to copy everything to the correct directories on the Pro Max.
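The copy step can be sketched like this. The host address, model file name, and directory below are placeholders standing in for my real values, not specifics from the Playbook; the key point is that ComfyUI expects model files in particular subdirectories of its models/ folder (checkpoints, VAEs, and so on), and scp can drop them there over the network.

```python
# A sketch of the model-copy step. The host, user, and file names here are
# hypothetical placeholders, not the real values from my setup. ComfyUI
# looks for model files in specific subdirectories of its models/ folder
# (e.g., checkpoints/), so the remote path matters.
HOST = "user@promax.local"                      # hypothetical Pro Max address
REMOTE_MODELS = "~/ComfyUI/models/checkpoints"  # ComfyUI checkpoint directory
LOCAL_FILE = "z-image-turbo.safetensors"        # hypothetical downloaded model

# Build the scp command; the same scp.exe is available from PowerShell
# on Windows 10/11, which is how I ran it from my ThinkPad.
cmd = ["scp", LOCAL_FILE, f"{HOST}:{REMOTE_MODELS}/"]
print(" ".join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```

Once the files land in the right directories, ComfyUI picks them up after a refresh, and the template's missing-model warnings disappear.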

Dell Pro Max desktop running Nvidia GB10 AI hardware

(Credit: Dell/Nvidia)

Once again, I rarely had to write or modify commands myself. Between the Playbook and Copilot, nearly every step was handled for me.

Now, for the prompting. I quickly discovered that prompt writing is its own craft. More detail typically produces better results, but contradictory instructions can render the output unusable. To test things out, I tried to replicate PCMag’s logo with the following prompt:

“A big red square against a white background with the letters 'PC' in the box along the top in large bold font and 'MAG' in smaller font on the bottom in smaller, non-bold font. The letters must be white, and the box must be filled in red. The box must have square corners. Use Arial Narrow for the font.”

The DGX Dashboard reported roughly 26GB of memory in use. That really frames the value of the Pro Max With GB10, since only the highest-end consumer graphics cards would have enough local memory to fit this model. The result was serviceable and on point...

Dell Pro Max desktop running Nvidia GB10 AI hardware

(Credit: Dell; Nvidia; Charles Jefferies)

Another lesson: The same prompt can yield noticeably different results if you run it several times. A few of my logo attempts had color bleed or distorted text, and when that happened, I simply ran it again. Iteration is part of the process.

I also tried generating something more artistic, using a nature prompt. The images below are variations from hitting the Run button a few times with the same description—a testament to the power of no usage limits!

“A fall sunset scene in the Pacific Northwest, with snow-capped mountains in the background and a birch grove and a stream in the foreground.”

Dell Pro Max desktop running Nvidia GB10 AI hardware

(Credit: Dell/Nvidia)

Dell Pro Max desktop running Nvidia GB10 AI hardware

(Credit: Dell/Nvidia)

These 1,024-by-1,024-pixel images took less than 10 seconds each to complete, making them ideal for rapid experimentation. Once I had the prompt dialed in, I pushed the resolution to 4K-class (3,840 by 2,400 pixels) and really put the Pro Max With GB10 to work: Memory usage climbed to about 31GB, and render times stretched to about two minutes. The system produces a fair amount of heat under a load like this, though its fans remain impressively quiet.


First Takeaway: This Is AI Development's Viral Moment

Dell’s Pro Max With GB10, and the GB10 systems like it, represent a meaningful shift in who can participate in AI development. Nvidia’s Grace Blackwell GB10 chip pairs the company’s latest GPU technology with 128GB of memory for capabilities that outpace even the most powerful consumer GPUs.

The hardware is nothing to sneeze at, of course. But it's the ecosystem around the Pro Max and other GB10 devices that ultimately makes this platform so desirable. The streamlined setup, clear documentation, and Nvidia’s excellent Playbooks remove many barriers that once made AI dev work feel inaccessible or opaque. In under an hour, I had the device unboxed and connected to my network, and was working through real development exercises.

While $4,000 (or a bit more, depending on the SSD capacity you choose) doesn’t make the Pro Max With GB10 an impulse buy, it is within reach for well-funded hobbyists, independent developers, and small teams of researchers or experimenters who previously had no practical path to this kind of capability. That alone makes it a milestone in AI development: a potential viral moment in which more people than ever become AI developers at once.

This is just the beginning of my journey in learning AI development with the Dell Pro Max and GB10. I'll explore the platform further in upcoming articles, including multi‑node configurations and comparisons with Nvidia’s own DGX Spark system. For now, the Pro Max With GB10 has me genuinely impressed and eager to see just how far this compact powerhouse can go when two units join forces.
