So You Want to Work in Hardware

Are you a software engineer undergoing an existential crisis every time you open social media? Do you fear that Claude Code has made you irrelevant? Are you idly entertaining previously unthinkable alternatives like the trades? Did you study software believing it was the most reliable path to a middle-class life in America, only to be faced with high new-grad unemployment rates, management directives for developer “efficiency”, and AI doom-mongering? You might be thinking about career pivots.

The last few months have been demoralizing for software engineers. At this point LLMs can pump out decent Python code, and agents can compile it, test it, and deploy it in an autonomous loop. A good engineer can still do it better, but plenty of executives who have never heard the phrase “technical debt” believe they can plunge ahead and reduce the number of highly-paid developers on their staff. And there are many, many highly-paid developers in the US, and many more clamoring to join the most successful sector of the century.

There has likely been an oversupply of programmers since the 2010s, the glory days of Learn to Code. As 2016 nostalgia memes remind us, you could once catapult yourself into a six-figure job with nothing more than a few months of prep work on Leetcode. No college degree or strong handshake required. It all reached a peak in 2020, when big software companies went on a zero-interest-rate-induced hiring frenzy, and things have gone downhill from there.

What to do?

Maybe you could just “learn AI!” This could either mean you learn how to use AI, or learn how to develop AI.

On the AI user side, you mostly benefit if you already have a senior-level understanding of a system and deep familiarity with its design choices. Engineers like that really do gain a lot from agentic coding systems. If you find yourself struggling to stand out among a sea of juniors, your awesome prompting skills are unlikely to distinguish you.

Meanwhile, becoming an AI developer requires postgraduate degrees, an insane work ethic, and a decade of experience in the dark art of training models. This is the hottest frontier in scientific research today, and as we sit here wondering how to fluff up our resume, a prestigious Stanford lab has already graduated another psychotically focused reinforcement learning expert grinding for their next lottery ticket at Anthropic. Those fancy jobs building AI models get pro sports-level pay packages from Mark Zuckerberg because they require pro sports-level dedication and experience from the most insane strivers in America.

What else is there besides software? The other default professional careers–law, medicine, finance–already groan with overcapacity. Most long ago started filtering out the potential pool with postgraduate degrees. No 6-week bootcamps for oncology, sadly. Besides, Claude Code will surely shrink the number of workers in those fields too.

What about something AI can’t do? Something physical!

Maybe, while doomscrolling AI doom articles, you have looked down at your disappointingly human fingers and realized they are holding a physical piece of hardware. That could be an option. At least you know something about computers already. It’s certainly more appealing than The Trades. You, a software engineer, can’t seriously think you are going to become an electrician. Just consider the culture fit if you became a tradesman–actually, don’t use words like “culture fit” in the trades. Don’t join the trades. Consider hardware. Hardware engineers probably won’t mock you in Spanish for showing up to a work site in a puffer jacket and Allbirds.

The tech drones of Office Space found meaning on a construction site

Many macroeconomic arrows seem to point towards hardware as well. For one, hardware engineering cannot be replaced by AI. We will perfect robots that can lift crates long before we invent one that can debug a busted oscillator on a PCB. Even the parts of the job that can theoretically be automated (PCB layout, digital circuit design, embedded programming) rely on long, slow cycles of prototyping, laborious real-world testing, manufacturing, and reliability work. Claude can instantly test its codeslop and correct errors with the help of compilers and interpreters. The same is not true for a vibe-designed circuit board.

Not only that, but the closer you get to hardware, the worse the automation tools get. Claude Code works quite well for scripting tasks, but it fails catastrophically when I try to use it to help with firmware. Documentation remains spotty and difficult to scrape for online bots. Most hardware and firmware knowledge either lives on decades-old forums that you have to unearth, or in the gray-haired heads of principal engineers. The FOSS movement never made much of a dent in firmware, meaning a huge amount of this code is proprietary and not available to train an AI. Circuit designs are even less publicly available. For now, the effort and money required just to find appropriate data to train an AI model in hardware or firmware design remain extremely daunting.

Geopolitically, too, America seems to have woken up from the decades-long delusion that All You Need is Software. Old-fashioned concepts like great power politics, war, and industrial capacity have returned in a big way in the 2020s. If software companies thrive on globalization, hardware companies thrive on de-globalization. While the public AI boosters keep bleating that the real race with China is for AGI, governments, investment banks, and venture capitalists have begun pouring investment into robots, drones, and other Hard Tech. And even the AI dreamers are finding that real-world constraints like RAM, energy, and cooling capacity limit AI progress as much as model performance does.

It isn’t clear whether we can really reshore industry, or make the US equal China in its industrial might. But the pendulum has clearly begun to swing back.

If you are considering hardware as a career, I can give you highly cynical and highly accurate advice. I have not yet started a scammy career consulting business, so I will give my thoughts for free.

I have worked in hardware in the Bay Area for about 8 years now. I have worked on radar sensors, self-driving cars, brain stimulation systems, and surgical robots. I graduated from one of America’s top engineering schools with really good grades. Consider this the equivalent of basketball advice from a Division I college player who rode the bench all 4 years, started in a few midseason games when a teammate had mono, and once made a pass to the star player for a game-winning shot over Chico State.

I have also spent most of my career being outearned and outshined in Silicon Valley by the Software Guys. I graduated in the heart of the Learn to Code era. In the 2010s you could get a job out of college at a FAANG company doing software engineering and immediately earn $200k. Even boot camp graduates could find lucrative remote jobs after only a few months of coding experience.

Of course the software boom predates the 2010s. Only the topic of the day varied: when I was a child in the 2000s, the hot thing was web development. Then it shifted to mobile app development in the early 2010s, then to data science in the mid-2010s. But the common factor was always software.

If you were a little more far-seeing, you might have started working on AI back in the 2010s, or rather “machine learning,” as it was called back then, when AI was still more of a sci-fi term that sober businesses avoided. If I were really smart, I probably would have done that. My friends from that time who got into AI labs and stuck with it have continued to out-earn me, and since ChatGPT dropped, they bask in the most intense spotlight ever given to any tech field.

Instead I studied hardware. It was not a great idea at the time. I did it mostly out of contrarianism, a general intuition that I shouldn’t do what everyone else is doing. I also had a vague sense of mens et manus, mind and hand. Coders can rebrand themselves as “builders,” but humans with tangible bodies respect those who build tangible things, things you can hold and touch and examine like the apes in 2001. I had a somewhat macho sense at a young age that I should be building something real, not a B2B SaaS company! Of course B2B SaaS rewards you with money, while hardware rewards you with lots of time on an oscilloscope.

But now the patron saints of B2B SaaS are announcing that it is “time to build.”

I should be gloating. A shift to hardware should mean a philosophical victory, a win for reality over illusion, materials over vagueness.

I have adopted a wait-and-see attitude instead. Crypto conmen and AI grifters still run the economy–they’ll figure out some way to create FTX for batteries. I don’t foresee a Socialist Realism future of musclebound Rosie the Riveters waving American flags as they build armies of robots. In this America, illusion will always triumph over substance.

Still, I can’t deny that interest in hardware and “deep tech” has skyrocketed recently, and that the field looks like a safe haven for white-collar workers who might be replaced by agent swarms.

So you want to work in hardware. What do you need to know?

First of all, it’s hard. Obviously. Everyone says this. The very things that make it difficult to automate make it difficult to do. The real world is far less deterministic than a computer processor. Although software tends to be buggy and unreliable, an actual computer chip can execute billions of instructions per second without ever making a mistake, which makes chips among the most reliable devices ever invented. If your code fails, you can be quite sure that the problem is not the machine executing your code. This is not true for hardware. Easily overlooked details like the length of a screw or the width of a metal plate can cause catastrophic errors. Harder still are errors that arise from the unfortunate stacking of real-world uncertainty. Strange, difficult-to-reproduce Heisenbugs appear when manufacturing tolerances combine with environmental conditions like heat or water, with user error, and with unusual software states.
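To make the tolerance-stacking point concrete, here is a minimal Monte Carlo sketch. All dimensions and tolerances below are invented for illustration, not taken from any real design: three parts, each individually within its spec, occasionally add up to an assembly that doesn’t fit its slot.

```python
import random

# Illustrative tolerance stack-up: three 10 mm parts go into a 30.08 mm slot.
# Each part is within its own +/- 0.05 mm spec, yet the stack can still fail.
NOMINALS = [10.0, 10.0, 10.0]   # nominal part widths, mm (made-up numbers)
TOLERANCE = 0.05                # per-part tolerance, mm
SLOT = 30.08                    # available slot width, mm

random.seed(42)  # reproducible runs

def assembly_fits() -> bool:
    """Draw one random assembly and check whether the stack fits the slot."""
    total = sum(random.uniform(n - TOLERANCE, n + TOLERANCE) for n in NOMINALS)
    return total <= SLOT

trials = 100_000
failures = sum(not assembly_fits() for _ in range(trials))
print(f"Assemblies that don't fit: {failures / trials:.2%}")
```

A few percent of assemblies fail even though every individual part passed inspection, which is exactly the kind of statistical failure that never shows up on the bench with one prototype.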

Second, hardware is slow. Products take a long time to design, test, and manufacture. You can’t easily modify a physical product in the way you can fix a line of code. Those slick Agile workflows your boss posts about on LinkedIn, the sprints with new features every week? They don’t work with hardware lifecycles, especially if you build hardware for a regulated industry like automotive, medical, or defense. You have to go back to good old waterfall design.

Hardware also moves slowly in an intellectual sense. Many of the most widely used microprocessors today date back to the 1990s, and fundamental ideas in embedded programming date back to the 1970s. While software engineers feel the need to learn a new framework every year to stay up to date, the advantage in hardware mostly comes from long experience with the same small body of knowledge. Juniors may not be replaced by AI, but they still face a profound learning curve. Young engineers cannot jump the line by learning a slick new programming language, no matter what the Rust guys claim. By entering hardware, you enter the turf of the graybeard. The rules are different.

Third, the pay is poor. You shouldn’t expect the good old days of the 2010s to return with “edge AI” replacing “B2B SaaS” as the buzzword du jour. The high wages of that era were a historical anomaly.

A friend asked me once why hardware pays so poorly (relative to software) when the skills are in high demand and, at least in the US, somewhat rare. The answer is that hardware simply does not have the same economics as software. Peter Thiel, a man who seems to yearn for more substantive Hard Tech developments, also outlined why Hard Tech is a worse business than software in his book Zero To One.

Software costs nothing to reproduce–once it is written, it can be copied indefinitely. The main economy of scale that software benefits from is the network effect of products like Facebook and TikTok. But in general, software margins are very high. Google Search, the greatest product in history, enjoys a profit margin between 75% and 87%. Hardware obeys the same rules of margins that other industries like fashion obey. You get the highest margins with luxury goods and with frontier products. As time goes by, competitors innovate and prices go down, so margins tend to go down too. The best-case scenario for profit would be something like ASML, which holds a monopoly on extremely expensive, precisely-engineered semiconductor manufacturing equipment. You won’t get much higher than ASML’s 52% gross margin.
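The margin gap can be put in back-of-the-envelope terms. The cost figures below are invented for illustration, not taken from any company’s filings: the point is only that software’s near-zero marginal cost per copy translates directly into gross margin, while every hardware unit shipped carries parts, assembly, and logistics cost.

```python
# Illustrative gross-margin arithmetic (made-up numbers, not real filings).
def gross_margin(revenue: float, cost_of_goods: float) -> float:
    """Gross margin as a fraction of revenue."""
    return (revenue - cost_of_goods) / revenue

# Software: once written, each additional copy costs almost nothing to serve.
software = gross_margin(revenue=100.0, cost_of_goods=10.0)

# Hardware: each unit shipped pays for components, assembly, and shipping.
hardware = gross_margin(revenue=100.0, cost_of_goods=55.0)

print(f"software: {software:.0%}, hardware: {hardware:.0%}")
# With these assumed costs: 90% for software vs 45% for hardware.
```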

But monopolies in hardware are rare. In software, they are often the rule thanks to those same network effects (and license monopolies, the secret sauce behind the worst software company in the world, Oracle). They are very difficult to maintain in hardware, even if you start with a huge leading advantage. Tesla, for instance, enjoyed almost a decade of monopoly over the luxury EV market, but now finds Chinese competitors like BYD cutting into its market share. Nvidia has become the largest company in the world based on its hardware monopoly, but ASICs from competitors have already started cutting into its lead. (Much of Nvidia’s monopoly also comes from its software advantage in CUDA, which is why capable GPUs from competitors like AMD have failed to gain traction.) Even Apple, the most successful hardware company in history, enjoys far from a monopoly in smartphones, especially looking at worldwide figures.

So hardware companies will not match the kind of profits that Google and Facebook and Microsoft rake in. The pay that an average bootcamp graduate enjoyed in the 2010s will not return. Of course, such pay used to be uncommon in Silicon Valley too. Before the dotcom boom of the 90s, back when Silicon Valley actually made silicon, engineers were paid fairly well, but not appreciably better than other professionals. You became an engineer instead of an accountant because you liked computers, not because you hoped to get rich.

Could a resurgence of hardware manufacturing lead to a larger base of more normally-paid professionals? I would like to look forward to a less unequal future for the friendly neighborhood tech bro. Certainly my home, the Bay Area, has been horribly distorted by the presence of a small clique of very highly-paid techies. It would form an ironic end to the AI future, which we all agree will lead to even higher inequality. I can’t predict what would happen if the hard talk about hard tech comes to fruition. The future of hardware has become more unpredictable than it has been in decades, and it certainly seems poised to become less staid. Maybe even less boring.

If I make hardware work sound like an annoying hassle, that is because in many ways, it is. Sometimes you drop a screw into a housing and can’t reach your fingers into a tiny slot to pull it out. Sometimes you accidentally touch two wires together and blow up an expensive voltage regulator. It is full of friction, disappointingly slow, and subject to brutal reality checks–the limits of distance, space, and material cost.

That is also the best part about this work. It disconnects you from the Dreamworld of software. The world inside the screen does not seem to obey limits. It compresses distance, reproduces data without cost, and removes friction. Huge organizations work hard to make you forget that this immaterial, instantaneous, magical world relies on electricity, copper, plastic, and rare-earth metals. It is ironic that the closer we seem to get to the incorporeal Machine God, the more these real limits come into focus. Hardware work pushes you up against those real-world limits. The greatest limit we face is the limit of our own body.

Knowledge work in the age of computers privileges information-gathering above all else, and promotes alienation from the body. It denies us even the minor sensory pleasures of sketching lines with a pencil, or scratching numbers onto a pad. It turns us into fleshy appendages of computers.

I remain sanguine about the long-term trajectory of hardware work because it uses the body and the mind. I once wondered what professions still require use of the hands. Surgery? Anthropology? The list is thin. We could use more of them. If “mind and hands” changes from a slogan to an endorsement of humanity’s greatest advantage over AI agents, then hardware will find its place in the sun.
