The Sony FD-30A has a very weird display:
On the surface, it looks like a normal CRT, except that it's impossibly thin: the whole device is just under 4 cm thick. To do this, the tube is mounted sideways, and the phosphor is viewed from the back.
Unfortunately, this rather unique display is completely useless: There are no analog TV stations on my continent... at least not with that attitude.
To get started, I grabbed my favorite 8-bit microcontroller, the AVR128DA28. (soon to be unaffordable due to its whopping 16k of onboard RAM)
The CPU has a maximum clock frequency of 24 MHz, but everything else maxes out at 12 MHz. Because an IO pin can only be toggled on a clock edge, the maximum output frequency is 6 MHz.
That's... not great. The lowest my TV can tune is 45 MHz. However, the microcontroller's output is a square wave, which contains frequency components at odd multiples of the nominal frequency.
These harmonics extend far up the spectrum: a 6 MHz square wave can easily be picked up several meters away by a receiver tuned to 198 MHz (its 33rd harmonic).
Thanks to these harmonics, the microcontroller can create a signal in the VHF television band just by toggling a pin. Video signals are inverted, so the output should be enabled to draw black, and disabled to draw white.
... except that doesn't quite work:
TVs need darker-than-black synchronization pulses to know when to begin each line and frame. The amplitude of these pulses must be higher than anything in the image, so even for a black-and-white image, the signal needs to have at least three levels.
The easiest way to do this is with two resistors:
By changing which pins carry the 6 MHz square wave and which ones are grounded, the MCU can produce four different RF amplitudes:
| PA1, PA2 | Output | Color |
|---|---|---|
| RF, RF | 1 VCC | Sync pulse |
| GND, RF | 2/3 VCC | Black |
| RF, GND | 1/3 VCC | Gray |
| GND, GND | 0 | White |
Because analog TVs don't have any storage, the video signal has gaps during which the electron beam returns to the top of the CRT. During these gaps, the CPU doesn't have any transmitting to do, so I decided to run Conway's Game of Life:
It isn't a game in the conventional sense. It has a grid of square cells, each of which is in one of two states: "dead" or "alive". The game progresses by repeatedly applying two simple rules to the grid:
A living cell dies if it has fewer than 2, or more than 3, living neighbors. (includes diagonals) A dead cell with exactly 3 living neighbors comes to life.
Over time, these rules give rise to complex behavior: some patterns don't do anything, but others oscillate, move, or even replicate. Simulating a random starting grid makes a good "screensaver", but some interactivity would be nice, so I added a small keypad which can be used to draw patterns:
The image is surprisingly good for how hacky the transmitter is, and can be received ten meters away: a testament to just how noisy digital electronics are.
The switching frequency doesn't matter: what's important is the signal's rise time. Any microcontroller project with long wiring will be spewing junk in the hundreds of MHz or even low GHz.
The only difference is that my circuit carefully controls the interference to send information... but with a specially designed receiver, it's possible to snoop on almost any circuit.
Important considerations:
The old TV bands aren't empty: The circuit is unlikely to cause any interference due to its insignificant output power, but other transmitters can interfere with it. Depending on your local conditions, you might not be able to get it working at all.
My TV hasn't been adjusted in decades, and microcontrollers don't have super accurate clocks. You might have to adjust the timings in the code.
... it also has continuous tuning: If yours doesn't (the tuning dial clicks from channel to channel), you will have to mess around with the OSCHFTUNE register to make it work. A spectrum analyzer would be very helpful.
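For reference, nudging the oscillator might look something like this. This is only a sketch: it assumes avr-libc's `_PROTECTED_WRITE` macro for the configuration-change-protected write, and the tune range, step size, and sign should be checked against the AVR128DA28 datasheet:

```c
#include <avr/io.h>

/* Nudge the internal high-frequency oscillator so the chosen harmonic
 * lands where the TV expects it. OSCHFTUNE takes a small signed offset
 * from the factory calibration; consult the datasheet for the actual
 * step size before relying on specific values. */
static void tune_oscillator(int8_t tune)
{
    _PROTECTED_WRITE(CLKCTRL.OSCHFTUNE, (uint8_t)tune);
}
```

In practice you would step the value up or down while watching the TV (or a spectrum analyzer) until the picture locks.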