Williams tube – cathode ray tube used as computer memory

en.wikipedia.org

57 points by silent90 10 years ago · 16 comments

Animats 10 years ago

The early days of electronic computing involved desperate attempts like this to build a fast temporary storage device. Things were in better shape on the arithmetic and control side - tubes worked, although it took some time to get component reliability up to an acceptable level. ENIAC, Colossus, and the IBM 603 Multiplier all had very limited memory, under 100 numbers.

This was a huge problem for anything that needed even a little storage. Western Union built Plan 55-A, a message switching system for telegrams, sort of like Sendmail. The buffering was all paper tape punches feeding paper tape readers. This was slow, expensive, and required large buildings full of paper tape gear and a huge staff. This was used from 1948 to 1976. The whole network of US paper tape gear switching centers was then replaced by one mainframe computer.

Today, we think nothing of software that needs a few gigabytes to display a text message.

  • rootbear 10 years ago

    As I've studied the history of the early first generation computers, this problem became increasingly clear. Computers were held back by the need for a main memory that was reasonably fast, reliable and affordable at the size needed. Magnetic core was a godsend. Some of the first computers I used had core memories and I have sometimes said to younger coworkers that I got into computers when individual bits were big enough to see.

    Others have mentioned that with Williams tubes, you could actually see the memory. That wasn't always the case, as not all Williams tubes had phosphor faces. A tube with a phosphor face was sometimes wired in parallel with the Williams tube to provide a visible copy of the data. The Wikipedia article on Williams tubes explains this:

    https://en.wikipedia.org/wiki/Williams_tube

  • eru 10 years ago

    Business computing had always used more memory. IBM had very interesting devices with punch cards.

    It's interesting to compare these two big strands: business vs scientific computing.

    • Animats 10 years ago

      I've made the point before that, even without Colossus, Turing, or ENIAC, electronic computing would have been developed by IBM, one step at a time, for business computing. IBM came out with the IBM 601 mechanical multiplier as a punch card machine in 1931. At last, businesses could multiply order quantity times price per unit to get a total price, and invoices could be generated without manual arithmetic. IBM slowly ground forward, with the IBM 602 in 1946, which could both multiply and divide. Next was the 602A, a more reliable 602. The 602A was used for some engineering work. But it only had storage for 6 numbers, and took several seconds to do a multiply.

      Then IBM introduced the IBM 603, in 1946. This used vacuum tubes, and did the same job as the 602, but much faster. IBM built 100 of those machines, to find out if they could put electronics in the field, service it, and keep it working reliably. It worked, so they followed up with the IBM 604, in 1948. This was a better 603, with more memory, more plugboard-programmed steps, and more compact packaging.

      Then came the IBM 605, in 1949. This was a 604 with I/O capability. It was possible to cable together an IBM 605, an IBM 527 punch, an IBM 412-418 tabulator with card reader and printer, and some IBM 941 storage units (16 numbers each!), creating a Card Programmed Calculator. This was almost a programmable computer.[1] The program had to be on cards, and was executed as the cards were read, one at a time. Up to 10 instructions could be stored internally, allowing small loops, but otherwise you could only execute the program once before putting in more cards. There were still plugboards involved, lots of them; each machine had its own plugboard, although there were general-purpose plugboards for card-programmed mode. The whole CPC setup was clunky, but it got work done.

      By now it was obvious that everything would be much easier if the program could be stored in some memory device. There were already a few "giant computers" by this point, including the IBM 701 Defense Calculator, but they were very expensive and only for huge organizations or the military. In 1953, IBM came out with the IBM 650, which was a real computer with the program and data stored on a drum rotating at 12,500 rpm. Each instruction specified the address of the next one, so you could get multiple instructions per rev by careful positioning. Don Knuth wrote an angular-optimizing assembler for the IBM 650, called SOAP III. The IBM 650 was the first mass produced computer - about 2000 were made. It was affordable by mid-sized businesses and was a big commercial success. Williams tube machines were faster, but far more expensive per bit.
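The drum optimization described here is easy to model: since each instruction names the address of its successor, an assembler can place that successor right where the drum will be when execution finishes, instead of waiting most of a revolution for address+1 to come back around. A rough sketch (the track size and execution times are illustrative, not the 650's exact figures):

```python
REV_MS = 60_000 / 12_500      # one revolution at 12,500 rpm = 4.8 ms
WORDS_PER_TRACK = 50          # illustrative track size
WORD_MS = REV_MS / WORDS_PER_TRACK

def wait_ms(current_addr, next_addr, exec_words):
    """Rotational delay before next_addr reaches the read head,
    given that executing the current instruction takes exec_words
    word-times after it was read."""
    ready = (current_addr + exec_words) % WORDS_PER_TRACK  # drum position when ready
    delta = (next_addr - ready) % WORDS_PER_TRACK          # words still to rotate past
    return delta * WORD_MS

# Naive layout: successor at addr+1, which has already gone by ->
# wait almost a full revolution.
naive = wait_ms(0, 1, exec_words=3)
# Optimized (SOAP-style) layout: successor placed where the drum will be.
optimized = wait_ms(0, 3, exec_words=3)
print(f"naive wait: {naive:.2f} ms, optimized wait: {optimized:.2f} ms")
```

With these toy numbers the naive layout waits about 4.6 ms per instruction while the optimized one waits nothing, which is why an angular-optimizing assembler mattered so much.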

      This shows how much of a limitation memory devices were in the early days. Electronic arithmetic without moving parts was in a production product by 1946, but memory was very limited for years afterward. (ENIAC had storage for 20 10-digit numbers. Colossus wasn't a general-purpose computer, it was a key-tester, like a Bitcoin ASIC.) It was clear early on that stored-program computers were the way to go, but the memory cost had to come way down before programs could be stored.

      Von Neumann, in his classic "Report on the EDVAC"[2], put the whole architecture together, describing what to do once some decent memory devices became available. It's an interesting read today, one of the basic documents of the history of computing. Von Neumann concludes that about 2^18 bits of memory, organized as 8K of 32 bit words, was needed to get useful mathematical work done. The program was expected to occupy about 1.5K words, with the rest reserved for data. Then he looks at the delay line problem - the delay line can store large numbers of bits, but the more it stores, the longer the access time. More delay lines in parallel help on speed, but increase the machine size and cost. Based on how long it takes to do a multiply, he proposes having 256 delay lines storing 1Kb each. The actual machine built was smaller - 1,000 44-bit words, made up of 128 delay lines. Memory cost was still a very big problem.
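The report's numbers are internally consistent, which is easy to check: 2^18 bits is exactly 8K words of 32 bits, and 256 delay lines of 1,024 bits each holds the same total. A small sketch of that arithmetic plus the access-latency tradeoff (the half-line average wait is my simplification, not von Neumann's model):

```python
# Von Neumann's EDVAC sizing, checked as arithmetic.
TOTAL_BITS = 2 ** 18
WORD_BITS = 32
assert TOTAL_BITS // WORD_BITS == 8 * 1024      # 8K 32-bit words

# Proposed organization: 256 recirculating delay lines of 1,024 bits.
LINES, BITS_PER_LINE = 256, 1024
assert LINES * BITS_PER_LINE == TOTAL_BITS

# The tradeoff: a delay line yields a given bit only once per
# circulation, so a random access waits half a line on average.
def avg_wait_bit_times(bits_per_line):
    return bits_per_line / 2

# Longer lines mean fewer lines (cheaper) but longer average waits.
for n in (256, 1024, 4096):
    print(f"{TOTAL_BITS // n:4d} lines of {n:4d} bits -> "
          f"avg wait {avg_wait_bit_times(n):.0f} bit-times")
```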

      Computing remained memory-limited for decades. Core memory was a million dollars a megabyte in 1970. Superminis of the early 1980s might have 4MB. PCs had 1MB. Not until the 1990s did memory get cheap and stop dominating the cost of a computer.

      [1] http://www.mirrorservice.org/sites/www.bitsavers.org/pdf/ibm... [2] https://sites.google.com/site/michaeldgodfrey/vonneumann/vne...

      • eru 10 years ago

        Awesome comment. Thanks!

        Where do I sign up for your newsletter?

saljam 10 years ago

This was part of The Baby, AKA the Manchester Small-Scale Experimental Machine at the University of Manchester in the late 40s. Modern replicas and simulators can be found at the university and Manchester's Museum of Science and Industry. It was built by Williams, Kilburn, and Tootill. Alan Turing wrote some programs for this machine.

One of the nicest things about this is that your memory is visible. You can see which bits are on and which bits are off just by looking at the screen. No debugger needed!
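The visible-memory point can be made concrete: the tube face is a grid of dots, one per bit, so inspecting memory really is just looking at (or printing) the grid. A toy rendering, sized to match the SSEM's 32 words of 32 bits (the rendering itself is mine, not period-accurate):

```python
def render(words, width=32):
    """Draw memory as a Williams-tube-style dot grid: one row per
    word, '*' for a 1 bit, '.' for a 0 bit, MSB on the left."""
    return "\n".join(
        "".join("*" if (w >> (width - 1 - b)) & 1 else "." for b in range(width))
        for w in words
    )

# A tiny 3-word "store": alternating bits, all ones, all zeros.
print(render([0xAAAAAAAA, 0xFFFFFFFF, 0x00000000]))
```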

One of the least nice things was that these tubes had a reputation for being horribly unreliable.

  • abraae 10 years ago

    Visible memory - useful indeed. As a newly minted mainframe engineer at IBM I worked with a guy who could read from punched paper tape by eye. It was most impressive watching him pulling it through his hands at a pretty good pace, reading it out as he went.

    • tomcam 10 years ago

      I worked with Tim Paterson (creator of MS-DOS) when I worked on the Visual Basic team at Microsoft. When he worked on code generation he just read hex dumps, not disassembly, because it was faster for him.

      • ams6110 10 years ago

        It doesn't take that long to learn when you have to do it. When I had to write an assembler as an undergrad project I became pretty much able to read 6809 machine code directly.
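For a feel of what reading machine code directly involves, here is a toy disassembler over a handful of 6809 opcodes (the table is hand-picked and far from complete; the real 6809 has hundreds of opcodes plus the $10/$11 prefix pages):

```python
# A few 6809 opcodes: byte -> (mnemonic, addressing mode, total length).
OPCODES = {
    0x86: ("LDA", "immediate", 2),   # load A with the next byte
    0x8E: ("LDX", "immediate", 3),   # load X with the next two bytes
    0xB7: ("STA", "extended", 3),    # store A at a 16-bit address
    0xBD: ("JSR", "extended", 3),    # call subroutine at a 16-bit address
    0x39: ("RTS", "inherent", 1),    # return from subroutine
}

def disassemble(code):
    """Walk a byte string, yielding one mnemonic per instruction."""
    i = 0
    while i < len(code):
        mnemonic, mode, length = OPCODES[code[i]]
        operand = code[i + 1 : i + length].hex().upper()
        yield f"{mnemonic} ${operand}" if operand else mnemonic
        i += length

print(list(disassemble(bytes([0x86, 0x41, 0xB7, 0x04, 0x00, 0x39]))))
# -> ['LDA $41', 'STA $0400', 'RTS']
```

After writing an assembler for such a table, the mapping from hex to mnemonic starts to stick in your head, which is all that "reading machine code" really is.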

        • tomcam 10 years ago

          The 32-bit instruction set was much bigger, with many more addressing modes.

srimech 10 years ago

The variety of memory used for computers is amazing, and I find it a lot more interesting than the history of the CPU. CPU architecture is very diverse, but physically tends to be just different configurations of a few physical elements - relay, valve or transistor.

Memory had cathode ray tubes, mercury delay lines, dekatron and selectron tubes, core store, core rope, magnetic drums, tapes and discs, magnetostrictive delay lines, magneto-optical discs, and probably many more I don't know about, not counting the write-once formats like punched card and tape.

  • agumonkey 10 years ago

    Random anecdote: a teacher once implied that some project used a satellite link as RAM because they had no other choice. I don't know more, but it's curious nonetheless.

  • 13of40 10 years ago

    One of my favorite backwaters of technology is the work they did in the 50's and 60's with "cryotrons", which were essentially like superconducting transistors that switched by one superconductor temporarily destroying another's superconductivity. They apparently even made a working memory module out of them and provided it to the NSA before the transistor came along and changed everything.

venti 10 years ago

There is an art installation that uses cathode ray tubes: http://www.alpha60.de/art/love_letters/

I guess that they probably don't use the tubes as the actual memory. But it looks extremely cool.

rberger 10 years ago

My first job was at the startup Micro-bit in the mid-1970s. We were building Electron Beam Access Memory systems as a "cache" memory between core and disk for mainframes.

It was an 18-bit-wide memory, so it had 18 CRTs, and it leveraged electron microscope techniques to increase density. It had two stages of electron beam steering.

I remember the founder walking around talking about our main competitor "Bubble Memory" which also never got very far.

http://www.computer.org/csdl/mags/co/1975/02/01649340.pdf

geon 10 years ago

Pretty high data density for the time.
