Aristotle's influence on computational thought

theatlantic.com

53 points by sharmajai 9 years ago · 6 comments

zeteo 9 years ago

> The history of computers [...] is better understood as a history of ideas, mainly ideas that emerged from mathematical logic

I have an advanced degree in CS and have done my fair share of theory, but ideas from logic are the icing on the cake; they did not create the computer. The real unsung heroes are those who invented relays, amplifiers, and punched-card machines. Babbage had a design in 1840 and Boole had a theory in 1850, but without components neither made a practical impact, and both were nearly forgotten.

Once relays and the like became widely available in the 1930s, it was anyone's game. Konrad Zuse built his first machine in his parents' apartment [1]. Feynman did complex computations at Los Alamos with punched-card machines. And, as the article mentions, Shannon created a theory of relay circuits because engineers were already building complex circuits.

[1] https://en.wikipedia.org/wiki/Konrad_Zuse
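Shannon's observation was that switching circuits realize Boolean algebra: contacts wired in series behave like AND, contacts in parallel like OR. A minimal sketch of that correspondence (function names here are illustrative, not from any source):

```python
# Contacts in series: current flows only if both are closed (AND).
def series(a, b):
    return a and b

# Contacts in parallel: current flows if either is closed (OR).
def parallel(a, b):
    return a or b

# A hallway two-way light switch is XOR, built from relay contacts:
def two_way_switch(a, b):
    return parallel(series(a, not b), series(not a, b))

print(two_way_switch(True, False))  # True  (light on)
print(two_way_switch(True, True))   # False (light off)
```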

  • maxander 9 years ago

    A large part of this, though, is that the groundbreaking mathematical insights that paved the way for computers are things that any modern ten-year-old takes for granted. The idea that we could design a single set of fundamental computational steps (a CPU instruction set) that could execute any program was key; that allowed people to abstract over all the different math engines built from whatever components were at hand and create the unified concept of "a computer."

    More concretely: you can definitely get an advanced CS degree without seeing much math, since CS is a giant and heterogeneous field. But if you study something like language design, you quickly run into mathematics that Boole would have felt very comfortable with.

  • woodandsteel 9 years ago

    When philosophers are successful, they produce ideas that are quite revolutionary, but are so obviously right that they become the implicit common sense of later generations, and as a consequence most people don't even know that they were created by an earlier set of geniuses.
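maxander's point above about "a single set of fundamental computational steps" can be sketched as a toy machine: one small, fixed instruction set, with behavior determined entirely by the program fed to it. The instruction names below are invented for illustration:

```python
# A toy accumulator machine with a fixed instruction set.
# Different programs, same machine -- the essence of the "universal
# computer" abstraction described above.
def run(program, x):
    """Execute a list of (opcode, operand) pairs against accumulator x."""
    for op, arg in program:
        if op == "add":
            x += arg
        elif op == "mul":
            x *= arg
        elif op == "neg":
            x = -x
        else:
            raise ValueError(f"unknown opcode: {op}")
    return x

# Two different "programs" running on the same instruction set:
double_plus_one = [("mul", 2), ("add", 1)]
negate = [("neg", None)]

print(run(double_plus_one, 5))  # 11
print(run(negate, 5))           # -5
```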

maxander 9 years ago

> Shannon himself encountered Boole’s work in an undergraduate philosophy class. “It just happened that no one else was familiar with both fields at the same time,” he commented later.

Never underestimate the power of being one of the few people to have a particular set of individually common skills. If you know molecular biology and group theory, or chemical engineering and architecture, or web design and sign language, or whatever and whatever, there are special opportunities open to you and almost no one else.

chis 9 years ago

I honestly don't think that machine learning deserves to be on this timeline. Despite the hype, we haven't seen much indication that it's anything other than a fad, propped up by its ability to do more complex statistical regressions than ever before thanks to modern hardware.

Maybe ML truly is our best shot right now, but juxtaposing it with the massive leap that computable logic represented makes it seem pretty inconsequential.

woodandsteel 9 years ago

I am glad that someone is explaining how Aristotle is the great-grandfather of modern computing.

You know, when you look at the histories of the various fields of study in the modern university, it's quite remarkable how many of them got their start, or at least a large part of it, with something Aristotle wrote 2,300 years ago.
