There is something quietly remarkable about the people who built the foundations of modern computing. Not their technical achievements, though those are extraordinary. What is remarkable is where they came from before they ever touched a machine.
Tony Hoare, who inspired the writing of this essay, studied Classics and Philosophy at Oxford: Latin, Greek, and modern philosophy under the "Greats" program at Merton College. As a schoolboy, he read George Bernard Shaw and Bertrand Russell and harbored a youthful aspiration to become a writer. He did not encounter computing until after he graduated in 1956 and only stumbled into programming after completing his National Service in the Royal Navy, where he had learned Russian. His interest in computers was awakened not by engineering curiosity, but by a fascination with the power of mathematical logic as an explanation of the apparent certainty of mathematical truth. That is a philosopher's question, not a programmer's.
Hoare went on to create Quicksort, one of the most widely used sorting algorithms in the history of computing. He developed Hoare logic, which gave programmers a formal way to prove that their programs are correct. He introduced Communicating Sequential Processes, a language for reasoning about how concurrent programs interact. Each of these contributions changed the shape of computer science. And each of them carries the fingerprints of a mind trained not in engineering but in philosophy, in the careful construction of arguments, in the pursuit of clarity and correctness not as engineering goals but as intellectual virtues.
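That economy of thought is visible in Quicksort itself. Below is a minimal C sketch of the algorithm using Hoare's partition scheme, in which two indices sweep toward each other and exchange out-of-place elements; the variable names and test data here are illustrative, not Hoare's own presentation.

```c
#include <stdio.h>

static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

/* Hoare's partition scheme: indices i and j sweep inward,
   swapping elements that sit on the wrong side of the pivot,
   until they cross. Returns the final position of j. */
static int hoare_partition(int v[], int lo, int hi) {
    int pivot = v[lo + (hi - lo) / 2];
    int i = lo - 1, j = hi + 1;
    for (;;) {
        do { i++; } while (v[i] < pivot);
        do { j--; } while (v[j] > pivot);
        if (i >= j) return j;
        swap(&v[i], &v[j]);
    }
}

static void quicksort(int v[], int lo, int hi) {
    if (lo < hi) {
        int p = hoare_partition(v, lo, hi);
        quicksort(v, lo, p);      /* with Hoare's scheme, the left half includes p */
        quicksort(v, p + 1, hi);
    }
}

int main(void) {
    int v[] = { 5, 3, 8, 1, 9, 2 };
    int n = (int)(sizeof v / sizeof v[0]);
    quicksort(v, 0, n - 1);
    for (int i = 0; i < n; i++) printf("%d ", v[i]);  /* prints 1 2 3 5 8 9 */
    printf("\n");
    return 0;
}
```

The whole mechanism rests on a single invariant, that everything left of the crossing point is no greater than the pivot, which is precisely the kind of claim Hoare logic was later invented to state and prove.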
Hoare himself said his interests went back to the ancient Greeks Aristotle and Euclid, and that their teachings provided an excellent basis for a general understanding, even today. A book of essays honoring his 60th birthday was titled A Classical Mind, a reference to his educational background in Literae Humaniores. His colleagues described his writing as having a "cultured style." When he spoke about software design, he said there are two ways to construct it: make it so simple that there are obviously no deficiencies, or make it so complicated that there are no obvious deficiencies. That is not an engineering observation. That is a philosophical one, rooted in the same tradition of clarity and economy of expression that a classical education instills.
Tony Hoare died in March 2026. The field lost not just a computer scientist but a classicist who happened to turn his attention to machines.
+++
Before Hoare, before any of them, there was Ada Lovelace.
Born in 1815 as the only legitimate child of the poet Lord Byron, Lovelace was steered deliberately away from poetry by her mother, who feared the "insanity" she associated with Byron's literary temperament. Mathematics and science were prescribed as a kind of intellectual medicine. Lovelace's mother ensured she received rigorous tutoring in logic and mathematics from some of the finest minds available, including the mathematician Augustus De Morgan and the scientific author Mary Somerville.
But the poetical inheritance could not be suppressed. Lovelace described her own approach as "poetical science" and herself as an "Analyst and Metaphysician." She did not see mathematics and imagination as opposing forces. She saw them as collaborators. When she encountered Charles Babbage's Analytical Engine, a mechanical computing device that existed only in design, she saw something that Babbage himself and every other mathematician of her era did not see: that the machine could go beyond pure calculation. She wrote that if the fundamental relations of pitched sounds in the science of harmony were susceptible to such expression, the engine might compose elaborate pieces of music of any degree of complexity or extent.
That insight, that a computing machine could manipulate any symbolic system and not merely numbers, is the founding conceptual leap of modern computing. It did not come from mathematics. It came from a mind that refused to separate the analytical from the poetical, a mind that understood machines not merely as calculators but as extensions of human creative capacity. A pure mathematician would not have made that leap. It required someone who lived between disciplines, who saw the world through both lenses at once.
Lovelace died at 36. She never saw a working computer. But her "poetical science" anticipated the entire field by more than a century.
+++
Edsger Dijkstra was born in Rotterdam in 1930. His mother was a mathematician and his father a chemist. Before he ever encountered a computer, he attended the Erasmiaans Gymnasium, one of the most prestigious secondary schools in the Netherlands, where he studied classical Greek and Latin alongside French, German, English, mathematics, physics, chemistry, and biology. He scored the highest possible marks in six out of thirteen subjects on his final exams. His early ambition was not to become a scientist. He wanted to study law and represent the Netherlands at the United Nations.
He chose theoretical physics instead, and then stumbled into programming at the Mathematical Centre in Amsterdam, where he was offered a part-time job in 1952. But the classical education never left him. Dijkstra became one of the most important figures in the history of computing. He formulated the shortest-path algorithm that now bears his name, developed structured programming, introduced the concept of semaphores for coordinating concurrent processes, and wrote the famous letter "Go To Statement Considered Harmful," which reshaped how programmers think about control flow. His Turing Award lecture was titled "The Humble Programmer."
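The shortest-path algorithm shows the same taste for stepwise clarity. Here is a minimal C sketch of it over a small adjacency matrix; the toy graph, the names, and the simple O(V²) vertex scan are illustrative choices of mine, not Dijkstra's own formulation.

```c
#include <stdio.h>
#include <limits.h>
#include <stdbool.h>

#define N 5            /* vertices in this toy graph */

/* Dijkstra's algorithm: repeatedly settle the unvisited vertex
   with the smallest tentative distance, then relax its edges.
   A weight of 0 means "no edge" in this sketch. */
static void dijkstra(const int graph[N][N], int src, int dist[N]) {
    bool visited[N] = { false };
    for (int i = 0; i < N; i++) dist[i] = INT_MAX;
    dist[src] = 0;

    for (int k = 0; k < N; k++) {
        int u = -1;    /* find the closest unvisited vertex */
        for (int i = 0; i < N; i++)
            if (!visited[i] && (u == -1 || dist[i] < dist[u]))
                u = i;
        if (u == -1 || dist[u] == INT_MAX) break;  /* remainder unreachable */
        visited[u] = true;

        for (int v = 0; v < N; v++)  /* relax edges out of u */
            if (graph[u][v] && !visited[v] && dist[u] + graph[u][v] < dist[v])
                dist[v] = dist[u] + graph[u][v];
    }
}

int main(void) {
    const int graph[N][N] = {
        { 0, 4, 1, 0, 0 },
        { 4, 0, 2, 5, 0 },
        { 1, 2, 0, 8, 0 },
        { 0, 5, 8, 0, 3 },
        { 0, 0, 0, 3, 0 },
    };
    int dist[N];
    dijkstra(graph, 0, dist);
    for (int i = 0; i < N; i++)
        printf("vertex %d: distance %d\n", i, dist[i]);
    return 0;
}
```

The movement from tentative to settled distances is exactly the sort of stepwise argument Dijkstra would later insist programmers learn to make explicit.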
What set Dijkstra apart from his peers was not only the depth of his technical insight, but the way he expressed it. Over the course of his career, he wrote more than 1,300 numbered manuscripts, known as the EWDs after his initials (Edsger Wybe Dijkstra), almost entirely by hand. They were not technical reports. They were essays, written in a prose style that his colleagues described as extraordinary. He could write about formal issues in essay form, with hardly any formulas, deriving intricate solutions in distributed computing in plain, seemingly informal prose. His handwriting was so distinctive that a computer font was later created from it. Colleagues who received one of his handwritten letters sometimes thought they were personal notes before realizing they were formal publications.
Dijkstra increasingly viewed programming as a mental activity, an intellectual discipline that demanded clarity of thought above all else. His courses at the University of Texas at Austin had little to do with computer science in the conventional sense. They dealt with the presentation of mathematical proofs. On the department's homepage, his research summary read simply: "My area of interest focuses on the streamlining of the mathematical argument." He was not interested in machines. He was interested in how humans think, and in building systems of thought that were honest, clear, and beautiful.
That concern for beauty, for the elegance of an argument, for the moral weight of clarity, did not come from computer science. It came from a gymnasium education steeped in classical languages and literature, from a young man who once dreamed of diplomacy, from a mind that never stopped treating writing as an act of care.
+++
Alan Turing studied mathematics at King's College, Cambridge, beginning in 1931. His formal education was entirely in mathematics and mathematical logic. At school, his teachers noted the contrast between his absorbed interest in science and mathematics and his indifference to Latin and English subjects. He was not, by any standard account, a humanities student.
But Turing's intellectual life was far wider than his formal education suggests. King's College in the 1930s was not merely a mathematics department. It was a place defined by its progressive intellectual culture, centered on figures like the economist John Maynard Keynes. Turing drew deeply from the work of Bertrand Russell, whose mathematical philosophy sits at the intersection of logic, language, and epistemology. Russell was not simply a mathematician. He was a philosopher who used mathematics to ask questions about the nature of knowledge, truth, and meaning. Turing absorbed that orientation.
When Turing published his most famous non-technical work, "Computing Machinery and Intelligence," he did not publish it in a mathematics journal or an engineering journal. He published it in Mind, a quarterly review of psychology and philosophy. The paper, which introduced what we now call the Turing test, is fundamentally a philosophical argument about the nature of thought, consciousness, and what it means to say a machine can "think." It engages with objections from theology, from mathematics, from Lady Lovelace's own writings, and from the philosophy of mind. Turing was not building a machine in that paper. He was asking what it means to be human, and whether the boundary between human thought and machine process is as firm as one might assume.
Turing's lasting fascination with problems of mind and matter was present throughout his life. He thought about consciousness, about biology, about morphogenesis, the process by which organisms develop their shapes. In the final years of his life, he was working on "The Chemical Basis of Morphogenesis," a founding paper of mathematical biology and modern nonlinear dynamical theory. He moved fluidly between mathematics, philosophy, biology, and what we would now call cognitive science. He did not respect the boundaries between disciplines because the questions he cared about did not respect them either.
His schoolmasters wanted him to have a "well-balanced education." He resisted. But the philosophical questions found him anyway, and they produced some of the most consequential thinking in the history of human knowledge.
+++
Grace Hopper arrived at Vassar College in 1924 to study mathematics and physics. Vassar was, and remains, a liberal arts college, and Hopper's education there was not confined to her concentrations. She took courses in economics, public finance, botany, physiology, geology, and electronics. She graduated Phi Beta Kappa in 1928 and went on to earn a master's degree and a PhD in mathematics from Yale. She returned to Vassar to teach, and her colleagues described her as both a great teacher and a sharp wit at the luncheon table.
During the war, she joined the Navy and was assigned to work on the Harvard Mark I, one of the earliest electromechanical computers. After the war, she turned down a full professorship at Vassar to keep working with computers. It was in this second career that her humanities sensibility became most visible.
Hopper's defining insight was not mathematical. It was humanistic. She realized that programming in mathematical symbols and machine code was a barrier that excluded the vast majority of people from using computers. She believed that computers should speak to people in English, not in the language of mathematics. When she proposed building a compiler that could translate English-language instructions into machine code, she was told flatly that she could not do it because computers did not understand English. She did it anyway.
The result was FLOW-MATIC, the first programming language to use English words, and later COBOL, which became the dominant language of business computing for decades. Hopper said it plainly: "Manipulating symbols was fine for mathematicians but it was no good for data processors who were not symbol manipulators. Very few people are really symbol manipulators. If they are, they become professional mathematicians, not data processors. It is much easier for most people to write an English statement than it is to use symbols."
That reasoning is not technical. It is an observation about human beings, about how people actually think and communicate, about the gap between expert knowledge and ordinary capability. It is the kind of observation that comes from someone who studied broadly, who taught across disciplines, who invented a mythical country to bring a dry mechanical drawing class to life for her students. Hopper understood that the value of a computer depended not on what it could compute but on who could use it. That is a humanities insight dressed in technical clothing.
When she accepted the National Medal of Technology, Hopper said that the accomplishment she was most proud of was all the young people she had trained over the years; that it was more important than writing the first compiler. She understood that technology lives and dies by its relationship to people.
+++
Dennis Ritchie grew up in Summit, New Jersey, the son of a Bell Labs scientist. He attended Harvard, where he studied physics as an undergraduate and applied mathematics as a graduate student. His education was entirely technical. He completed a draft of his PhD thesis on subrecursive hierarchies of functions but never formally received the degree. He described his own trajectory with characteristic dry humor: "My undergraduate experience convinced me that I was not smart enough to be a physicist, and that computers were quite neat. My graduate school experience convinced me that I was not smart enough to be an expert in the theory of algorithms and also that I liked procedural languages better than functional ones."
Ritchie went to Bell Labs in 1967 and stayed for forty years. There he created the C programming language and, with Ken Thompson, the Unix operating system. These two creations are arguably the most influential artifacts in the history of software. C is the ancestor of nearly every widely used programming language today. Unix and its descendants power the majority of servers, phones, and embedded systems on Earth. The conceptual DNA of modern computing flows directly through Ritchie's work.
Ritchie had no formal humanities training, but his colleagues paint a picture of someone whose intellectual life extended far beyond programming. He was widely read, interested in classical music, and knowledgeable on a broad range of topics. One colleague recalled being at the San Diego Zoo with Ritchie and discovering that he knew the details of seemingly every animal they encountered. He was described universally as modest, kind, and generous, always giving credit to others while downplaying his own contributions.
What is most telling about Ritchie's humanities sensibility is his writing. The book he co-authored with Brian Kernighan, The C Programming Language, is still considered one of the finest technical books ever written, more than four decades after its publication. It is precise, economical, and elegant. Every sentence does work. There is no bloat, no unnecessary complexity, no showing off. The book teaches not just C, but a way of thinking about programming that values clarity and simplicity above all else. That is not a technical achievement. That is a literary one.
Ritchie was also shaped by Bell Labs itself, which in its golden era was one of the most extraordinary intellectual environments ever assembled. Physicists, mathematicians, linguists, and engineers worked side by side. The culture valued breadth, curiosity, and the freedom to pursue ideas across disciplinary boundaries. Ritchie did not study humanities in school, but he spent his career in a place that embodied the humanities ideal of cross-pollination between fields.
Dennis Ritchie died in October 2011, the same week as Steve Jobs. The world mourned Jobs loudly. Ritchie's death was quieter. One commentator noted that Ritchie's work played a key role in spawning the technological revolution of the last forty years, including technology on which Apple built its fortune. The tools that made the modern world possible were built by a quietly brilliant man who read widely, wrote beautifully, and never sought the spotlight.
+++
Brian Kernighan was born in Toronto and studied engineering physics at the University of Toronto, then earned his PhD in electrical engineering from Princeton. Like Ritchie, his education was purely technical. He spent thirty years at Bell Labs, where he contributed to the development of Unix, co-authored The C Programming Language, and co-created the AWK programming language with Alfred Aho and Peter Weinberger (he is the K).
But something interesting happened when Kernighan moved from Bell Labs to Princeton's Computer Science department in 2000. He began to gravitate, with increasing purpose, toward the humanities.
He started teaching a course called "Computers in Our World," designed not for computer science majors but for students studying literature, politics, history, and other humanities disciplines. He co-taught one course with a professor from the French and Italian department, and another, on "data in the humanities," with a professor from the English department. He became involved with Princeton's Center for Digital Humanities. He ran independent work seminars where computer science juniors explored data from the humanities, looking at properties of books, history, social interactions, and patterns in human behavior. A New York Times profile of him was titled "To the Liberal Arts, He Adds Computer Science."
This is a man who spent three decades building some of the most important systems-level software ever written and then spent the next two decades trying to connect that work back to the humanities. He has said in interviews that Princeton stresses breadth over depth, and that his job as an advisor is to get students to sample the amazing collection of different things that humankind does. He encourages computer science students to explore broadly because he believes exposure to those other disciplines is part of what makes a complete thinker.
Kernighan did not arrive at computing through the humanities. But his career arc suggests that he recognized something was missing, that technical excellence alone was not enough, and he spent the second half of his professional life trying to build the bridge he never had as a student. That search is itself a kind of evidence. When one of the most accomplished systems programmers in history spends his later years teaching literature majors about computers and computer scientists about poetry, it tells you something about what the field needs and does not have.
+++
Richard Stallman graduated from Harvard in 1974 with a bachelor's degree in physics, magna cum laude. He had been programming since high school and had been working at MIT's Artificial Intelligence Laboratory since his freshman year. His formal education was entirely in the sciences. He chose physics over mathematics for a pragmatic reason: the physics degree did not require a thesis.
Stallman's contribution to computing is not primarily technical, though he is a gifted programmer who created GNU Emacs, the GNU Compiler Collection, and the GNU Debugger. His lasting contribution is philosophical. He founded the Free Software Foundation and articulated a moral framework for software that treats user freedom as a fundamental ethical principle, not a business strategy. He argues that the ability to study, modify, and share software is a human right, and that proprietary software is antisocial and unethical. He has framed control over software as a civil liberties issue, speaking and writing about it for decades with the intensity of a moral philosopher.
Stallman is an interesting case for this essay because his thinking is deeply humanistic while his training is not. He arrived at ethical philosophy through personal conviction and lived experience, not through formal study. The free software movement is, at its core, a philosophical argument about freedom, autonomy, and the relationship between creators and users. It draws on traditions of thought that belong to ethics and political philosophy, whether Stallman studied them formally or not.
But there is something instructive in the gap between the depth of Stallman's ethical insight and the way he has communicated it over the years. His positions are often correct and almost always principled. His delivery has frequently been abrasive, uncompromising, and alienating to potential allies. Eric Raymond, one of the founders of the adjacent open-source movement, argued that Stallman's moral arguments, rather than pragmatic ones, alienate potential allies and hurt the end goal. Whether one agrees with Raymond or not, the observation points to something worth considering: having the right ethical instincts is not the same as having the skill to communicate them in ways that move people. The humanities do not just teach you what to care about. They teach you how to bring others along.
+++
There is a pattern in these stories, and it is not subtle.
The earliest figures in computing, Lovelace, Turing, Hoare, and Dijkstra, were either formally educated in the humanities or deeply immersed in philosophical thinking as a central part of their intellectual lives. Lovelace called her approach "poetical science." Hoare studied Latin, Greek, and philosophy before he ever wrote a line of code. Dijkstra studied classical languages and wanted to be a diplomat before he became a programmer. Turing published in a philosophy journal and spent his life asking questions about consciousness and the nature of thought.
The next generation, Hopper and Ritchie, came through educational environments that still valued breadth. Hopper's Vassar education exposed her to economics, botany, physiology, and geology alongside mathematics. Ritchie's Harvard and Bell Labs years immersed him in a culture where intellectual curiosity across disciplines was the norm, not the exception.
By the time we reach Kernighan and Stallman, the humanities influence has thinned to almost nothing in their formal education. Kernighan studied engineering physics. Stallman studied physics. Both are brilliant. Both are consequential. But Kernighan has spent the second half of his career actively seeking out the humanities connection he never had, and Stallman's philosophical contributions, while profound, have sometimes suffered from a lack of the communicative skill that humanities training cultivates.
And after them? The pattern does not continue. It ends.
Look at the technology leaders and influential engineers of the last twenty years. Look at the people building the platforms, the social networks, the AI systems, the attention economies. How many of them studied philosophy? How many of them read classical literature or engaged seriously with ethics or political theory or the history of human thought? How many of them could write an essay in the style of Dijkstra, or articulate a vision of human-centered computing like Hopper, or reason about consciousness and moral responsibility like Turing?
The answer is vanishingly few. And we can see the results.
+++
This is not an abstract concern. The absence of humanities thinking in modern technology has consequences that are felt by real people in their daily lives.
Social media platforms were designed by engineers who optimized for engagement metrics without understanding, or perhaps without caring, what the deliberate manufacture of outrage and addiction does to a human mind. Attention economies were built by people who studied computer science and business but never studied propaganda, rhetoric, or the psychology of manipulation. Algorithms that determine what billions of people see, read, and believe every day were written by teams that rarely included anyone trained in ethics, epistemology, or the philosophy of information.
The result is a generation of technology that is technically sophisticated and humanistically bankrupt. We have systems that can predict what you want to buy but cannot be bothered to consider whether showing a teenager an endless feed of curated suffering is a good idea. We have AI models trained on the full breadth of human knowledge being deployed by people who have never seriously asked what responsibility comes with that power. We have apps that children are required to use for school that crash, load slowly, and harvest data, built by engineers who never stopped to think about the family on a slow internet connection whose only concern is whether their kid got to school safely.
Alan Turing was asking whether machines could think in 1950. He published that question in a philosophy journal because he understood it was a philosophical question, not a technical one. Seventy-five years later, we are building machines that increasingly influence how humans think, and the people building them have largely abandoned the disciplines that would help them understand the gravity of what they are doing.
Lovelace saw that computing could touch music, language, and art. She saw that because she lived between poetry and mathematics and refused to choose. Hopper saw that programming had to speak human language because she understood that most people are not symbol manipulators. Dijkstra wrote essays instead of technical reports because he believed clarity of expression was a moral obligation. These were not decorative sensibilities. They were the insights that shaped the field itself.
When you strip the humanities out of the people who build technology, you do not get more efficient engineering. You get engineering that has lost its sense of purpose, its awareness of consequence, and its ability to see the people on the other side of the screen.
+++
None of this is meant to suggest that every programmer needs a philosophy degree or that studying Latin will make you better at writing Go. The point is not about credentials. It is about disposition.
The people who built the foundations of computing shared something that went beyond technical brilliance. They had a concern for clarity that extended past code into how they communicated with other human beings. They had a sense that their work existed in a larger context, that the systems they built would be used by people whose lives would be shaped by those systems. They asked not just "does it work" but "is it good," and they understood that "good" meant something more than functional correctness.
That disposition does not require a degree. But it does require more than casual interest. Reading a philosophy book on a weekend is a hobby. Spending serious, sustained time with literature, with history, with ethics, with the long conversation that humanity has been having with itself for thousands of years about what it means to live well and treat each other justly, that is something different. It changes how you see problems. It changes what questions you think to ask. It changes what you are willing to ship and what you refuse to.
Higher education bears some responsibility for the current state of things. Computer science programs at most universities require little to no engagement with the humanities. Students can graduate with the technical skills to build systems that affect millions of lives without ever having been asked to seriously consider what that responsibility means. But blaming institutions only goes so far. The choice to engage with the humanities is ultimately a personal one, and it is available to anyone with a library card, an internet connection, and the willingness to sit with difficult, unfamiliar ideas long enough to let them change how you think.
+++
This essay has asked you to spend some time with the stories of people who built the world you work in every day. It has asked you to slow down, to read about their lives, to notice that the things you admire most about their work, the elegance, the clarity, the concern for people, the refusal to accept unnecessary complexity, came not from their technical training but from something deeper and older.
If you are a technically oriented person and you have made it this far, you have already demonstrated something. You have invested time in a piece of writing that does not teach you a new framework or show you a clever optimization. You have sat with ideas that do not have immediate practical application. You have, for a few minutes, done the thing that this essay is asking you to consider doing more of.
The founders of your field read philosophy and wrote essays and studied classical languages and asked questions about consciousness and beauty and the nature of human thought. They did not do these things because they had spare time or because someone made them. They did them because they understood, implicitly or explicitly, that building machines worthy of human trust requires understanding what it means to be human.
That understanding is not going to come from another tutorial, another side project, or another Hacker News thread. It is going to come from the long, slow, sometimes difficult work of engaging with the humanities. Read philosophy. Read literature. Read history. Study how humans have thought about ethics, about power, about communication, about what it means to build something that lasts. Sit with those ideas long enough to let them reshape how you approach your own work.
The tradition that produced Lovelace's poetical science and Hoare's classical mind and Dijkstra's handwritten essays and Hopper's insistence that computers must speak to people is not dead. But it is diminishing, generation by generation, as the field grows further from its roots. You have the opportunity to reverse that. Not by going back to school, necessarily, but by taking the humanities seriously enough to let them into your professional life, your design decisions, your code reviews, your conversations with the people who will use what you build.
The machines we build reflect the minds that build them. If those minds are nourished only by technical knowledge, the machines will be technically competent and humanistically hollow. If those minds are broad, curious, and grounded in the long tradition of human thought, the machines will be something better. They will be worthy of the people who depend on them.
The choice, as it has always been, is yours.