The Little-Known Genius Who Helped Make Pixar Possible
wired.com

This classic of his taught me much about rendering: http://alvyray.com/Memos/CG/Microsoft/6_pixel.pdf
However, if we're going to talk about actual genius, no one even comes close to Eric Veach; he's like the von Neumann of rendering. Fun bit of trivia: after he finished completely revolutionising all of rendering with his PhD thesis[0], he went on to develop a little thing called AdWords.
[0] Robust Monte Carlo Methods for Light Transport Simulation - https://graphics.stanford.edu/papers/veach_thesis/
> he went on to develop a little thing called AdWords.
So he really is like the von Neumann of rendering!
https://en.wikipedia.org/wiki/John_von_Neumann#Nuclear_weapo...
I don't understand his point:
> A pixel is a point sample. It exists only at a point.
Wouldn't that completely depend on where the pixel comes from?
For a rasterised vector drawing, surely each pixel will get its colour from the whole area it represents?
If it's a downsampled photo, each pixel would correspond to several points.
And in digital cameras, does a pixel really represent a "point" in space? In a 10-megapixel camera, you'd ideally want each sensor pixel to capture all the light that hits its 1/10-millionth share of the sensor area. That way you maximise light intake, and capture more information than if you merely sampled "a single light ray" at the centre of each sensor pixel.
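Concretely, here's a small sketch of that difference between point sampling and averaging over each pixel's footprint (illustrative only; the 1-D "scene" and the sizes are invented, not from any real sensor):

```python
import numpy as np

# Illustrative sketch (numbers invented, not from any real sensor):
# a fine-grained 1-D "scene" whose intensity varies faster than the
# pixel grid, downsampled two ways.
scene = np.sin(np.linspace(0, 20 * np.pi, 1000)) ** 2  # 1000 fine samples

factor = 100  # each output pixel's footprint covers 100 scene samples

# Point sampling: read a single value at the centre of each footprint
point_sampled = scene[factor // 2 :: factor]

# Area averaging: integrate over the whole footprint, the way a sensor
# well collects all the light that falls on it
area_averaged = scene.reshape(-1, factor).mean(axis=1)

# Here each footprint spans roughly one period of the pattern, so the
# centre samples all land near zeros of it (aliasing), while the
# averages sit near the pattern's true mean intensity of ~0.5.
print(point_sampled.round(3))
print(area_averaged.round(3))
```

Point sampling at footprint centres can miss (or alias) detail that area averaging integrates over, which is part of why real sensors and good downsamplers average rather than sample.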
So he is an evil genius.
That's a bit unfair. Nobody knew the implications of AdWords, or of online advertising in general, when it began.
Edit: I was wrong; read mistrial9's and chubot's comments.
This is flatly false. I was a participant on academic networks in the early 90s, and on the proto-Internet from 1995 onwards. A primary means of communication between coders at that time was Usenet, which I read and posted to daily. Programmers discussed at length, repeatedly, for years, the ethics and social implications of commercialising the networks, of owning domains personally and re-selling them, and of using ads, ad placement, commercials, micro-payment systems and more.
The general sentiment at the time was that it was rude, crude and unacceptable to monetise the open networks with ads: the networks were clearly a "commons", and every Western person was familiar with billboards along a highway and endless commercials on radio and television.
It is quite interesting today to see the name of the man who "invented adwords" as I had not seen it even once before.
Yeah, also the original 1998 paper that describes the architecture of the Google Search Engine calls this out explicitly:
http://infolab.stanford.edu/~backrub/google.html
> Currently, the predominant business model for commercial search engines is advertising. The goals of the advertising business model do not always correspond to providing quality search to users. For example, in our prototype search engine one of the top results for cellular phone is "The Effect of Cellular Phone Use Upon Driver Attention", a study which explains in great detail the distractions and risk associated with conversing on a cell phone while driving. This search result came up first because of its high importance as judged by the PageRank algorithm, an approximation of citation importance on the web [Page, 98]. It is clear that a search engine which was taking money for showing cellular phone ads would have difficulty justifying the page that our system returned to its paying advertisers. For this type of reason and historical experience with other media [Bagdikian 83], we expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.
That's pretty much exactly what happened to Google (and the entire web honestly) in the last ~10 years. I'd say that Google was doing a very admirable job of avoiding this bias from 1998-2008 or so -- both search and ads were good.
But basically the founders moved onto other things, handed search and ads over to VPs, pre-IPO employees left, etc. and the result was predictable. In large enough numbers, people behave according to incentives... it's basically "physics".
-----
Also I think GoTo.com is notable for inventing paid search -- it wasn't Google. There was a book that covered this history in detail -- I think it was one of the many books about Google.
I don't think it's that cut and dried. Before AdWords, advertising was already known to be a scummy business, accused of indoctrinating children and the like.
> the unlikely mecca of the field, the University of Utah
The story is that David C. Evans was made head of Utah's CS department in 1966, having worked on Berkeley's timesharing research, Project Genie, and having been born in Salt Lake City. He applied for graphics funding from the Arpa-backed Information Processing Techniques Office (IPTO), which funded practically all the major computing leaps of the 1960s, including the Arpanet, in pursuit of JCR Licklider's vision of "man-computer symbiosis". Not only did the IPTO stump up the funds, they decided to make Utah an IPTO centre of excellence in graphics. Furthermore, in 1968 they sent Ivan Sutherland, a former IPTO director who had invented the line-drawing program Sketchpad for his PhD under the supervision of Claude Shannon, down to Utah to join Evans. Sutherland credited Evans as "the man who figured out the basic techniques that made modern computer graphics possible".
So bear in mind that Alvy Ray Smith, and thereby Pixar, stood very much on the shoulders of giants, and that the foundations of computer graphics were laid with a serious amount of public spending.
14 years ago, Alvy Ray Smith wasn't listed with the other people in Wikipedia's Pixar navbox. I added him a couple of times but someone kept deleting him without comment. I found it so odd.
https://en.wikipedia.org/w/index.php?title=Template:Pixar&di...
Thankfully someone else seems to have added him back in and it has stuck now.
It was probably Steve Jobs who did that.
Happily there's a change record demonstrating that, unsurprisingly, that wasn't the case.
I love these histories of technology. The people and the times are fascinating. I never watched Toy Story growing up; I was in high school and it seemed like a child's movie, so I wasn't interested in it. Now, 25+ years later, I have a daughter who is of the age (2.5) to understand it. Recently we did a marathon, watching all the movies over a couple of days. It's shocking to see how the animation improves over the years, but even more astonishing is how good the original one was for being made back in 1995. I can only imagine how magical it felt back then. As the article points out, algorithms and raw computation are necessary, but not sufficient, for success. Creativity and artistry are required to truly be great.
He wrote the book, "A Biography of the Pixel" [1], which should be an interesting read and is mentioned in the article.
A promo for the book was discussed here when the book was published last month:
> As Smith recalls it, Jobs began mocking Smith’s Southwestern accent. “I had never been treated that way. I just went crazy,” Smith says.
Texan here. That is just beyond the pale. What a complete and absolute jerk.
Unfortunately, this does sound exactly like something Jobs would do. He was a jerk.
Off topic, but reading the headline and the intro about Alvy Ray Smith, I was half convinced this was going to be a satirical article about how he invented raytracing and that it is actually named after him; in a similar vein to the "Running was invented in 1784 by Thomas Running when he tried to walk twice at the same time" meme [0]
Yeah, it's funny how he actually ended up as a "ray smith" for a living. And even the "Alvy" part means something graphics related in some obscure language iirc. He wrote about that on his homepage, but I can't find it now - his homepage (at http://alvyray.com) is a bit... special.
Similarly, being good enough was invented by John B. Goodenough when he invented the lithium ion battery, which was good enough.
His first name is ALvy and he invented ALpha channel.
Porter+Duff's 1984 Compositing Digital Images paper paints a slightly different picture:
https://graphics.pixar.com/library/Compositing/paper.pdf

> Credit should be given to Ed Catmull, Alvy Ray Smith, and Ikonas Graphics Systems for the existence of an alpha channel as an integral part of a frame buffer, which has paved the way for the developments presented in this paper.
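The compositing algebra that paper introduces is compact enough to sketch: with premultiplied alpha, the familiar "over" operator is just out = fg + (1 − α_fg) · bg. A minimal, illustrative version (function and variable names are mine, not from the paper):

```python
import numpy as np

def over(fg, bg):
    """Porter-Duff 'over': composite premultiplied-RGBA fg on top of bg.

    Inputs are arrays of shape [..., 4] whose RGB channels are already
    scaled by alpha (premultiplied), so the operator is simply
    out = fg + (1 - alpha_fg) * bg.
    """
    alpha_fg = fg[..., 3:4]
    return fg + (1.0 - alpha_fg) * bg

# 50%-opaque red composited over opaque green (premultiplied RGBA)
red_half = np.array([0.5, 0.0, 0.0, 0.5])  # (1, 0, 0) at alpha 0.5
green = np.array([0.0, 1.0, 0.0, 1.0])
print(over(red_half, green))  # [0.5, 0.5, 0.0, 1.0]
```

Premultiplying is what makes the operator a single multiply-add per channel, which is why it mattered that the alpha channel lived in the frame buffer itself.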
It must be approaching 40 years ago now, but I remember watching a BBC TV weekly documentary called Horizon that was all about the fledgling computer graphics industry - I think the episode was called Painting by Numbers. Alvy Ray Smith was interviewed along with other luminaries like Ed Catmull, Loren Carpenter, Jim Blinn & Nelson Max.
I must have watched it 30+ times via my parents’ wheezy old VHS player and would dearly love to see it again but never managed to find it.
Being so enthralled and exhilarated by the magical things they were doing seems like it should have led me into the same industry but I was never clever enough to do so. I still get a thrill when their names come up in articles like this though. Thanks for sharing.
Oh my goodness. Thank you so much. I cannot wait to watch it again.
Truly this was a wonderfully written article about Alvy Ray Smith.
> ...You are not looking at pixels on your screen but the expression of those pixels. The pixel itself? That’s just an idea.
At the end I realized it's written by Steven Levy, who is the author of "Hackers: Heroes of the Computer Revolution" and other books on the history of computers.
I love Steven Levy. He is a national treasure.
What's really crazy is that Alan Kay is involved, AGAIN! He's the 20th Century's wizard behind the curtain.
An interesting bit of history! I will have to pick up Smith’s book soon.
Also very, very much recommended: "Droidmaker: George Lucas And the Digital Revolution", which features Alvy Ray Smith a lot.
It has a homepage at https://www.droidmaker.com
I don’t have much to say about the article, I’m just glad it wasn’t a headline joke about Steve Jobs.
It is a joke about the article NOT being about Steve Jobs.