Richard Hamming’s contributions to digital error correction laid the foundation for how modern computing systems detect and correct mistakes in data.

Richard Hamming in the Naval Postgraduate School lab with the IEEE Hamming Medal in 1988. Image used courtesy of ACM
But like many key figures in electronics history, his path from mathematics to computer engineering involved a series of detours, breakthroughs, and one very frustrating weekend.
From Math to Machines
Born in 1915 in Chicago, Hamming had set his sights on engineering until the Great Depression narrowed his choices. Instead, a scholarship offer from the University of Chicago nudged him into mathematics. The university’s radical “New Plan” curriculum emphasized deep conceptual thinking across disciplines, with core sequences in physics, chemistry, biology, and the humanities.
That interdisciplinary approach to education, and the mindset it created, followed him through graduate studies: a master's degree at the University of Nebraska and then a Ph.D. from the University of Illinois in 1942. His thesis focused on boundary-value problems in differential equations, an abstract topic that would prove surprisingly useful in applied computing.
After teaching at Louisville and Illinois, Hamming took an unexpected detour into history.
Joining the Manhattan Project
In 1945, Hamming joined the Manhattan Project at Los Alamos, New Mexico. While physicists like Richard Feynman tackled neutron diffusion and implosion equations, Hamming served as “chief mathematician” for the punched-card calculators that processed their math.
These relay-based IBM machines were finicky and fragile. Keeping them running was a full-time job. Hamming later described his role at Los Alamos as that of a "computer janitor," keeping the machines running as they churned through simulations of experiments that would otherwise have been impossible to perform in a laboratory. Hamming's wife, Wanda, also worked at the lab, initially as a human computer and later under Enrico Fermi's supervision.

Richard Hamming with his wife, Wanda, in 1980. Wanda worked as a human computer on the Manhattan Project during World War II. Image (modified) used courtesy of ACM
While working on the Manhattan Project, Hamming was assigned to double-check a now-infamous calculation that asked whether the Trinity atomic test could ignite Earth's atmosphere. That brush with existential risk planted a lifelong concern with scientific responsibility.
A Weekend That Changed Computing
After the war, Hamming joined Bell Telephone Laboratories, where he worked with Claude Shannon and John Tukey. There, surrounded by early computing systems and noisy communication channels, Hamming asked a now-famous question: If machines can detect errors in data, why can’t they fix them too?
A weekend in 1947 gave him the answer. He had set up a long computer job to run unattended, but when he returned on Monday, he found that the machine had detected an error early in the run and simply dumped his work, leaving nothing done. Frustrated, he developed a technique that could not only detect errors but also locate and correct them automatically. This became the Hamming code.
The concept was simple but revolutionary. By adding a handful of parity-check bits, each computed over a carefully chosen subset of the data bits, one could build redundancy into binary data. If a single bit flipped, the pattern of parity checks that failed pointed directly to its position, so the system could determine not just whether an error had occurred, but where. Machines could self-correct without manual oversight.
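The classic Hamming(7,4) layout makes the mechanics concrete: three parity bits protect four data bits, and the weights of the failed checks sum to the position of a flipped bit. The Python sketch below is a minimal illustration of that scheme; the function names and demo values are illustrative choices, not code from Hamming's own work.

```python
def hamming74_encode(data_bits):
    """Encode 4 data bits into a 7-bit Hamming(7,4) codeword.

    Positions are numbered 1..7. Parity bits sit at the power-of-two
    positions (1, 2, 4); data bits go to positions 3, 5, 6, and 7.
    """
    assert len(data_bits) == 4
    code = [0] * 8                       # index 0 is unused padding
    for pos, bit in zip((3, 5, 6, 7), data_bits):
        code[pos] = bit
    for p in (1, 2, 4):                  # parity bit p covers every position
        for pos in (3, 5, 6, 7):         # whose binary index contains p
            if pos & p:
                code[p] ^= code[pos]
    return code[1:]


def hamming74_correct(codeword):
    """Return (corrected codeword, error position); position 0 means clean."""
    code = [0] + list(codeword)
    syndrome = 0
    for p in (1, 2, 4):
        parity = 0
        for pos in range(1, 8):
            if pos & p:
                parity ^= code[pos]
        if parity:                       # a failed check adds its weight
            syndrome += p
    if syndrome:                         # the syndrome is the bad position
        code[syndrome] ^= 1
    return code[1:], syndrome


# Demo: encode, flip a single bit, then locate and repair it.
word = hamming74_encode([1, 0, 1, 1])
corrupted = word.copy()
corrupted[4] ^= 1                        # damage position 5 (1-based)
fixed, position = hamming74_correct(corrupted)
print(word, corrupted, fixed, "error located at position", position)
```

Flipping any single one of the seven bits in the demo produces a syndrome equal to that bit's position, which is exactly the property that lets a machine repair the word without human intervention.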
This idea, along with several others from Hamming's research, is still foundational to electrical engineering and computer science today. These include:
- Hamming matrix: The parity-check structure used to define the relationships between data and redundancy bits.
- Hamming distance: The number of differing bits between two binary strings. It's now used in everything from DNA sequencing to cybersecurity.
- Hamming numbers: A sequence of "smooth" integers built only from the prime factors 2, 3, and 5, useful as a classic algorithm-design exercise and as convenient FFT lengths. Both ideas are sketched in code after the figure below.

A two-dimensional visualization of the Hamming distance. Image used courtesy of Josiedraus via Wikimedia Commons (Public domain)
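To make those last two ideas concrete, here is a brief Python sketch, assuming integer inputs for the distance and a min-heap generator for the numbers; the helper names are illustrative rather than standard library calls.

```python
import heapq


def hamming_distance(a, b):
    """Count the bit positions at which two equal-length binary words differ."""
    return bin(a ^ b).count("1")


def hamming_numbers(n):
    """Return the first n Hamming numbers (only prime factors 2, 3, and 5),
    generated in ascending order with a min-heap."""
    heap, seen, result = [1], {1}, []
    while len(result) < n:
        smallest = heapq.heappop(heap)
        result.append(smallest)
        for factor in (2, 3, 5):
            candidate = smallest * factor
            if candidate not in seen:
                seen.add(candidate)
                heapq.heappush(heap, candidate)
    return result


print(hamming_distance(0b1011101, 0b1001001))  # 2 bits differ
print(hamming_numbers(10))                     # [1, 2, 3, 4, 5, 6, 8, 9, 10, 12]
```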
Teaching to Ask the Right Questions
Hamming stayed at Bell Labs until 1976, after which he became a professor at the Naval Postgraduate School in Monterey, California. There, he turned his focus to teaching and writing, launching his influential seminar, "The Art of Doing Science and Engineering."
He believed that students should learn how to learn. That is, instead of memorizing solutions, they should master how to ask the right questions. As he put it, “The purpose of computing is insight, not numbers.”
In lectures and in print, he argued for elegance over rote mechanics. His books, including Numerical Methods for Scientists and Engineers, Digital Filters, and Coding and Information Theory, became standard references for engineers worldwide.
Hamming continued teaching until just weeks before his death in 1998. Today, the technologies he helped pioneer have become embedded in everything from smartphones to spacecraft. The IEEE named its highest award in information sciences after him—the Richard W. Hamming Medal.
His legacy lives on not only in textbooks and hardware but in the ethos of problem-solving that values clarity, creativity, and courage. Whether you’re designing a data link or debugging a signal chain, chances are you’re standing on Hamming’s shoulders.