PAL Colour Recovery from black-and-white ‘telerecordings’ (2008)

techmind.org

82 points by madflame991 3 years ago · 46 comments

hilbert42 3 years ago

In Australia, before the official launch of PAL colour television in 1975, it was a requirement of the broadcasting regulator of the day, the now-defunct ABCB (Australian Broadcasting Control Board), that television stations remove any colour content from their TV broadcasts. (During the conversion period leading up to the launch, stations would run a mixture of B&W and colour material.)

To comply, stations would strip the colour burst from the TV video sync block before it was broadcast. This infuriated many propeller-head techies and nerds, myself included.

To overcome the problem, the 4.43 MHz colour subcarrier still present in the broadcast video (only the burst had been deliberately stripped out) was used to reconstitute the colour burst. This was achieved by modifying standard PAL colour TV sets (which weren't that difficult to obtain) with the addition of some subcarrier-extracting filters and appropriate phase-locking/modifying circuitry. This was a bit tricky, as the reference phase was no longer there, and the fact that it was a PAL signal (PAL - Phase Alternating Line encoding) complicated things further.
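
Just to illustrate what that reconstitution amounts to in software terms - a rough DSP sketch, not the actual filter/PLL circuitry, with an assumed 4x-subcarrier sample rate and an arbitrary burst amplitude - the phase of the residual subcarrier in a scan line can be estimated by correlating the line against a quadrature reference at 4.43 MHz, and a burst synthesized from that estimate:

    import numpy as np

    FSC = 4433618.75        # PAL colour subcarrier frequency, Hz
    FS = 4 * FSC            # assumed sampling rate: 4x subcarrier

    def estimate_subcarrier_phase(line_samples):
        # Correlate one scan line of composite video against a quadrature
        # reference at the subcarrier frequency; picture content away from
        # 4.43 MHz largely averages out, leaving the residual chroma phase.
        t = np.arange(len(line_samples)) / FS
        ref = np.exp(-2j * np.pi * FSC * t)
        return np.angle(np.sum(line_samples * ref))

    def synthesize_burst(phase, cycles=10):
        # Regenerate a short burst locked to the estimated phase. The absolute
        # hue reference is still ambiguous (the real burst is gone), which is
        # part of what made this tricky, along with PAL's line-by-line
        # phase alternation.
        n = int(cycles * FS / FSC)
        t = np.arange(n) / FS
        return 0.3 * np.sin(2 * np.pi * FSC * t + phase)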

In fact, I recall at the station I was working for at the time we had a modified TV set in the engineering department working in colour from off-air signals (one of my colleagues was a past master at tweaking up sets this way).

Perhaps a bit of broadcasting history trivia but it sure shows the colour recovery technique in this story wasn't the first effort.

Edit: Incidentally, the same trick was used on source material such as quadruplex videotape that already had the burst stripped at other locations.

  • dannyw 3 years ago

    What was the benefit / reasoning for removing colour content?

    • maratc 3 years ago

      Can't speak for Australia, but Israel in the 1970s considered colour television "a luxury that would increase social gaps"[0], and a similar colour-burst erasing happened. This led to a rise of anti-erasing devices that reconstructed the colour burst, returning the colours.

      [0] https://en.wikipedia.org/wiki/Color_killer

      • immibis 3 years ago

        Wow, caring about social gaps. Unthinkable today.

        • CamperBob2 3 years ago

          Unthinkable because government control at that level of granularity is such a stupid and downright offensive idea to begin with, or because it just plain doesn't work?

          • immibis 3 years ago

            Yes, unthinkable because people generally find that the idea of fixing social problems is offensive, and just blame the victims for not fixing the problems they are suffering from.

          • zymhan 3 years ago

            False dichotomy alert

            • CamperBob2 3 years ago

              'Attempt to rationalize totalitarianism' alert.

              • immibis 3 years ago

                "Totalitarianism is when we fix social problems". Thanks for providing an example of people finding it offensive to fix social problems.

                • CamperBob2 3 years ago

                  If you think it's the government's place to prohibit color TV on the grounds that it promotes social inequality, you've largely disqualified any opinions you might have about totalitarianism, social science, or color TV.

      • samson8989 3 years ago

        Israel had a huge balance of trade problem, and color television was an expensive import. So they sold the trade restriction through moral arguments!

    • hilbert42 3 years ago

      The usual bureaucratic crap, the government wanted the kudos - grand opening etc. There was also the legitimate problem that some stations would be ready before others and thus have a leading economic advantage. (Also, there had to be a reasonable supply of TV sets available.)

    • bzzzt 3 years ago

      > What was the benefit / reasoning for removing colour content?

      Studios filming for TV optimized the colors of their sets for contrast on a B&W TV set, which meant they used ugly colors (the cheapest paint that worked) that were never meant to be reproduced.

      • hilbert42 3 years ago

        Also the chroma (4.43 MHz subcarrier) was visible[1] on B&W sets and could be somewhat annoying, especially on those with good focus and resolution. Moreover, by the time colour was introduced B&W sets had progressed to having quite reasonable resolution and could easily resolve detail up around 4.43 MHz.

        These 'artifacts' would be made worse when optical herringbone effects, etc. were generated from the mixing of the visible subcarrier with certain types of image content such as striped suits or any content containing closely spaced lines of reasonably high contrast.

        This type of content was always a worry even before the colour/subcarrier issue as it could mix with the scan lines to produce unpleasant optical beating-type effects.

        Some TV stations even had dress code policies to avoid the problem but they were often ignored even before colour was introduced (it's often a bit tricky to tell an important dignitary to change his suit or striped tie before he appears on camera). :-)

        __

        [1] Note: in colour TV sets the chroma subcarrier is intrinsically suppressed by the PAL system (which converts the S/C to colour info) although some artifacts remain because of bandwidth, switching issues and other system limitations.

        It should be remembered early TV systems faced huge engineering obstacles and the 405, 525, CCIR 625 and 819-line system standards were remarkable engineering developments in their own right given the engineering strictures of the time - so colour wasn't on the agenda when they were developed.

        Essentially, colour was an afterthought that had to be retrofitted to and be compatible with these existing B&W television standards.

        The biggest problem was that the colour subsystem had to be fitted within the existing broadcast spectrum, that being the bandwidth of a TV channel which was typically 6, 7 or 8 MHz depending on system or country. And back then this was no easy feat.

        Moreover, squeezing in colour became an even bigger challenge given that bandwidth-limiting techniques were already being employed to reduce spectrum usage, for instance, interlaced scanning and broadcasting video in reduced-bandwidth vestigial sideband.

        Overcoming the bandwidth-limitation challenge posed formidable problems but eventually clever minds solved them with some ingenious solutions, the first of which was the NTSC Color System. It was followed by SECAM and PAL, which were in essence variants of NTSC; the raison d'être for their development was to overcome NTSC's 'phase' problem, wherein colours could easily drift from the original.

        SECAM and PAL had ingenious but quite different ways of 'clamping' down this 'phase-shifting' problem and both were successful at doing so. Leaving national interests/pride aside (SECAM-Fr, PAL-Deu), arguments over which scheme was best revolved around considerations of the technical issues such as bandwidth tradeoff versus amount of phase shift correction that was deemed possible and such. Also included were matters such as the amount of residual artifacts that the colour information would introduce into the main luminance component.

        Now, I'm not going to get into that perennial debate about whether SECAM or PAL is better except to say that my primary television experience was gained from working within a CCIR-625/PAL environment and it's an excellent system. But then SECAM is excellent too, and I can attest to that having spent considerable time in France watching it.

        The US NTSC Color Standard often comes in for criticism but I reckon that's unfair given it was the first. Moreover, it has one significant advantage over the other two and that is its 60 Hz field rate (the others being 50 Hz). I noticed the difference this makes when I first visited the US years ago: almost my first perception of the country happened at Los Angeles airport when I noticed that the airport's monitors weren't flickering (flicker being a significant annoyance in 50 Hz systems)!

        A final point: when considering TV encoding systems we cannot forget Nyquist and cohorts who made all that information-squeezing possible, similarly so Shannon whose brilliant ideas have gone into the development of the encoding schemes now used in our modems and digital audio and television systems.

        It seems to me that advances in encoding techniques were essentially just as important as those in image sensor development. Both technologies are essential and integral parts of modern digital television, thus developments in both were of critical importance for DTV's development.

        Incidentally, the first colour TV camera I examined close up was a huge and almost unmanageable beast made by RCA. It used three separate vidicons for the chroma channels and an image orthicon for the luminance. Whenever I look at the camera in my smartphone I never cease to be amazed at the progress we've made in these technologies over the past 50 or so years.

timonoko 3 years ago

Black & white TV was almost HD, 625x625. Then they added the 3 MHz color carrier in 1966 and it was 300x300 with this color fuzz on top. This sucked so much. There was nothing I wanted to see in living color. Especially winter sports, which were mostly B&W anyway.

I remember that color movies also sucked in the 1950s. Technicolor has annoying fuzziness around objects. See Wizard of Oz.

  • briantw 3 years ago

    You're confusing the analogue horizontal resolution, which is set by the bandwidth of up to 5 MHz, with the number of vertical lines, which is fixed at 625 per frame (a line rate of 15,625 lines per second). No amount of chroma noise will change the latter. And as for the former, the analogue horizontal resolution, the highest horizontal resolution you could get, even at 6 MHz bandwidth, would have been 380 analogue lines - nowhere near 625. So, that's 625 lines down, and that's fixed - you couldn't change it. And only about 380 across, and yes, that could be affected by noise, chroma subcarrier, etc.

    And speaking of chroma subcarrier, yes, you will often see crappy chroma fuzz on a black-and-white image, but the reason for that is that the TV set did not have a chroma filter (exactly what they are talking about in this thread). In fact, as I'm answering that, I'm realising that THAT was probably the reason they had to filter out the colour - because it would look crappy on older black-and-white sets that did not have a chroma filter installed. Bingo!

    But I have a couple of circuits from like 30 years ago that converted NTSC or PAL to RGB, and yes, they have the required filter - without it you did indeed see the little blockies from NTSC or diagonal fringing on PAL colour transients.

    Interesting discussion!

  • avian 3 years ago

    > Black&White-tv was almost HD, 625x625. Then they added 3Mhz color-carrier in 1966 and it was 300x300

    Maybe adding color did decrease the luma bandwidth and hence the horizontal resolution. I'm not sure about that. I think bw signals just used less bandwidth overall.

    But in no way did color decrease the number of lines in the image. Those are defined by the scanning raster and remained the same in color and bw television.

    • Taniwha 3 years ago

      It's more obvious in US NTSC than the PAL being discussed here ... essentially the colour subcarrier was put way out in the high-frequency part of the luma signal; display anything with too high a bandwidth and it stomps on the colour. You've all seen this happen on analog TV ... and it has had profound effects on fashion ... let me explain ...

      So what does "high frequency luma" mean? It means that the brightness of the signal horizontally along a line goes rapidly from dark to bright and back again - if that happens it stomps on the colour subcarrier and the colour goes wonky.
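
      A tiny numerical illustration of that point (assuming, arbitrarily, a composite line sampled at four times the subcarrier frequency): a fine black/white stripe pattern whose frequency along the scan line lands at 4.43 MHz is, to the decoder, indistinguishable from chroma.

          import numpy as np

          FSC = 4433618.75                 # PAL subcarrier, Hz
          FS = 4 * FSC                     # assumed sample rate along a line
          t = np.arange(4096) / FS

          # Fine black/white stripes whose frequency along the scan line
          # happens to sit right at the colour subcarrier frequency.
          stripes = 0.5 + 0.5 * np.sign(np.sin(2 * np.pi * FSC * t + np.pi / 4))

          spectrum = np.abs(np.fft.rfft(stripes - stripes.mean()))
          freqs = np.fft.rfftfreq(len(stripes), 1 / FS)
          print(freqs[np.argmax(spectrum)] / 1e6)   # ~4.43 (MHz): looks like chroma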

      S-video is just a cable that puts the 2 signals on different wires so this doesn't happen.

      So it turns out that the worst things for this are checked or plaid shirts/ties/dresses, tartans, houndstooth jackets, etc. Think about what happened to fashion in the 70s/80s as colour TV became ubiquitous: people on TV started wearing solid colours, because nobody wanted to be the person whose whole body was a crawling mess, and people in the rest of the world started wearing the same sorts of styles. All those 50s/early-60s styles with checks and plaids you see on old game shows are gone, not because of some big change in fashion, but because they could no longer be represented in popular culture.

      • stuaxo 3 years ago

        I remember going to the US in 1998 and being shocked at how bad NTSC TV looked compared to PAL, the colours just looked wrong.

        • lb1lf 3 years ago

          Back when I used to moonlight in video production, the quip was that NTSC was an acronym for 'Never Twice the Same Colour'.

          The French SECAM system? 'Something Essentially Contrary to the American Method'

          I'll lead myself out.

          • cf100clunk 3 years ago

            SECAM was also jokingly called "Système Électronique pour Confondre les AMéricains"

        • cf100clunk 3 years ago

          For technical reasons inherent in the chosen standard, NTSC TV sets required hue and color knobs, unlike PAL and SECAM. This effectively left it up to the consumer to adjust those values, with no accounting for variances in eyesight or taste. Unfortunately it meant that entire households had to endure the choices of whoever (Dad?) controlled the TV. On visits to others' homes it was painful to see how appallingly bad some people's preferences were. With PAL and SECAM the hue and colors were set to a standard, and that was that. Having said all that, the 29.97 fps frame rate of NTSC was much easier on the eyes than the flickering 25 fps of PAL and SECAM.

          • Aloha 3 years ago

            NTSC adopted the color system it did because the cost of delay lines was considered too high; PAL was also more technically complex and probably would have delayed the adoption of color TV, which was unacceptable to RCA.

            The US oft has this problem, we tend to be early adopters of technology on a wide scale, so by the time a thing comes along that solves most of the inherent problems in the v1, we already have a wide scale implementation of the thing. This happened with TV color, phones (24 Channel T1 vs 32 channel E1 and aLaw/uLaw), credit cards (mag stripes), and all sorts of other things.

            SECAM had some real advantages, but made working with composite signals hard because of their FM nature. PAL and NTSC are reasonably close conceptually, and frankly so is SECAM at the component level; you can easily transcode PAL into SECAM, because it's mostly the composite signals that differ.

            NTSC was originally 525 lines / 60 fields per second (odd/even lines), giving an effective refresh of 30 fps; the 525 lines itself was dictated by our 6 MHz channel width, and the field refresh by our 60 Hz power. When they added color, they dropped the field refresh down to 59.94 to deal with a beat-frequency issue between the color subcarrier and the audio subcarrier.
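
            For reference, the standard NTSC colour numbers show where that adjusted rate comes from (it is usually quoted as 59.94 fields per second): the line rate was nudged so that the 4.5 MHz sound offset is an exact multiple of it, and the colour subcarrier an odd half-multiple, so the beat between the sound and colour carriers stays interleaved with the picture.

                # Standard NTSC colour numbers, just to show where 59.94 comes from.
                sound_offset = 4.5e6               # sound carrier is 4.5 MHz above vision
                line_rate = sound_offset / 286     # line rate nudged to 4.5 MHz / 286
                subcarrier = line_rate * 455 / 2   # chroma: an odd half-multiple of line rate
                field_rate = line_rate / 262.5     # 525 lines / 2 fields per frame

                print(round(line_rate, 2))         # 15734.27 Hz (was 15750 in B&W NTSC)
                print(round(subcarrier, 2))        # 3579545.45 Hz, the familiar 3.58 MHz
                print(round(field_rate, 2))        # 59.94 Hz (was 60.00 in B&W NTSC)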

          • Taniwha 3 years ago

            I'm guessing that watching PAL/SECAM requires slightly slower phosphors than NTSC, and you'd see flicker on a modern computer monitor designed for 60-75 Hz ....

        • briantw 3 years ago

          Did you try and adjust the hue? That was a necessary step in getting the colours right over in NTSC land. If you didn't adjust it, yes, you got pictures that were too magenta or too green. That said, I have an NTSC LaserDisc player still plugged in today, and if the hue is appropriately set, the colours are perfectly fine.

        • Aloha 3 years ago

          NTSC and PAL had very very different color gamuts.

          • Taniwha 3 years ago

            Yes - in the US, HDTV was a revelation for many people because of the much larger colour gamut; it was just so much better. In Europe it was mostly just bigger.

      • avian 3 years ago

        None of this supports the parent's claim that TV went "from black&white 625x625 to color 300x300", which is just wrong on several levels.

        • fortran77 3 years ago

          But there's some truth in this statement!

          The "Luma" resolution is, in theory, 625x625, but the "chroma" resolution is approximately 1/4 of that. That's OK, because of the way our eyes work.

          So "detail" remains at the 625x625 resolution, but the color information isn't that high. And our brains fill in the rest.

          • tialaramex 3 years ago

            Digital video chroma sub-sampling literally has quarter chroma resolution in 4:2:0 video which is or at least was fairly common for live action stuff. It's obviously not going to be great for recording output from a computer, with sharp coloured edges, but live action scenes look fine.

            I don't think anybody would claim that their 4:2:0 Blu ray has "low resolution" because it used chroma sub-sampling.
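
            As a minimal sketch of what 4:2:0 means in practice (illustrative NumPy only; the 576x704 frame size is just an example): the luma plane keeps full resolution while each chroma plane is reduced to one sample per 2x2 block, i.e. a quarter of the samples.

                import numpy as np

                def subsample_420(y, cb, cr):
                    # Keep luma at full resolution; average chroma over 2x2 blocks.
                    def quarter(c):
                        return (c[0::2, 0::2] + c[0::2, 1::2] +
                                c[1::2, 0::2] + c[1::2, 1::2]) / 4.0
                    return y, quarter(cb), quarter(cr)

                y = np.random.rand(576, 704)       # hypothetical SD frame
                cb = np.random.rand(576, 704)
                cr = np.random.rand(576, 704)
                y2, cb2, cr2 = subsample_420(y, cb, cr)
                print(y2.shape, cb2.shape)         # (576, 704) (288, 352)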

        • timonoko 3 years ago

          > None of this supports the parent's claim that TV went "from black&white 625x625 to color 300x300", which is just wrong on several levels.

          Obviously no one here has experienced pure crystal-clear B&W TV and what happened when they turned the color carrier on. You had to adjust the focus so that the horizontal resolution was below 320. And of course the vertical focus was similarly affected, as there were no separate screws for that.

      • mhalle 3 years ago

        Crosstalk between the luminance and chrominance signals is called dot crawl (the chrominance signal interfering with the luminance signal, causing spatial artifacts) and chroma crawl (the luminance signal stepping on the chrominance signal, causing color artifacts).

        I believe chroma crawl was generally more of an issue with NTSC, if the luminance signal wasn't sufficiently band limited.

    • timonoko 3 years ago

      > But in no way did color decrease the number of lines in the image.

      But it did. If you did not want to see the annoying 3 MHz color carrier on your B&W TV, you had to adjust the focus to 300 horizontal lines, which affected the vertical focus too.

  • timonoko 3 years ago

    Had to correct numbers:

    Super-good B&W TV was 625 x 625 x 25 = 10 MHz. The color carrier was 4.3 MHz. So if you did not want to see the color-shit on your B&W TV, you had to adjust the focus so that less than 625 x (4.3e6 / (625 x 625 x 25)) == 275 horizontal lines were visible. TVs did not have a separate adjustment for vertical focus. So all you really had was a 270x270 TV.

    Except of course there never were 10 MHz TV channels. It was below 8 MHz, which was needed for full color. So there was a moment in time when we could enjoy 8 MHz black and white for a year. Almost 600 horizontal lines. And then they turned the color on and the party was over.

    • adrian_b 3 years ago

      Actually in the 625-line TV standard there are only 576 visible lines.

      The other lines are for the vertical retrace, when the video signal is blanked.

      With square pixels, the B&W image would have been 576 x 768, which requires a 7.5 MHz analog video bandwidth (@ 50 Hz vertical & 15625 Hz horizontal frequencies).

      Most 625-line B&W TV sets could display 576 x 768 images very well and some of the early personal computers with video outputs for TV used this format.

      Nevertheless the broadcast TV signal was limited by a low-pass filter to lower horizontal resolutions, corresponding to 5 MHz analog video bandwidth in Western Europe and to 6 MHz analog video bandwidth in Eastern Europe. The reason was to provide space in the TV frequency channel for the audio signal, which used a carrier offset from the video carrier by 5.5 MHz in Western Europe and by 6.5 MHz in Eastern Europe.

      So the broadcast B&W signal was worse than what the B&W TV sets could display, corresponding to 576 vertical pixels by about 510 to 620 horizontal pixels (depending on the country).
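
      A quick back-of-envelope check of those numbers (taking roughly 52 us as the active portion of the 64 us line, and two pixels per cycle of video bandwidth) lands close to both the 7.5 MHz figure and the 510-620 pixel range above:

          visible_lines = 576
          active_line = 52e-6                           # approx. active portion of a 64 us line

          # Square pixels over a 4:3 raster: 576 * 4/3 = 768 pixels per line.
          px_per_line = visible_lines * 4 // 3          # 768
          bandwidth = (px_per_line / active_line) / 2   # one cycle resolves two pixels
          print(round(bandwidth / 1e6, 1))              # ~7.4 MHz, i.e. roughly the 7.5 MHz above

          # What a band-limited broadcast signal can resolve horizontally instead:
          for bw in (5e6, 6e6):
              print(round(2 * bw * active_line))        # ~520 and ~624 pixels per line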

  • hilbert42 3 years ago

    "Technicolor has annoying fuzziness around objects. See Wizard of Oz."

    This happens with Technicolor only when it's processed badly and the registration isn't done with sufficient precision. I agree, this has happened from time to time.

    Moreover, you also have to consider where the source material for the Technicolor process originated. Tri-separated B&W negatives were used in the late 1930s, Wizard of Oz being one notable example and Gone With The Wind the other major one.

    Prints from tri-separations can be quite excellent, in fact brilliant, as the colour can be precisely adjusted. Also, colour 'compromises' don't have to be made in the printing as is intrinsically the case with films that use colour couplers - Eastmancolor (Eastman color negative, its internegative and theatre release/print stock) to name just a few.

    (Colour couplers in film emulsions are at best compromises as they have to be compatible with the processing chemistry and many of the best colour dyes and pigments are not. Processes that do not use colour couplers such as Kodachrome and Technicolor are much superior in this regard as stable dyes with the correct (or best) colour can be used. Colour couplers also lower the resolution of an emulsion although in many modern emulsions this isn't a significant problem.)

    Nevertheless, if tri-separated B&W originals are used after being stored a long time then shrinkage differences in the three negatives can pose printing/registration issues.

    It would be interesting to know the source of your Wizard of Oz, as some years back the DVD version took this into account when the film was remastered. Every frame of the tri-separated B&W printing masters was resized to ensure its geometry was identical to all the others. I've seen that remastered copy and its registration is excellent.

    Incidentally, the very last version of the Technicolor processes of the 1950s was the best colour film system for movies ever devised before they went digital. However, one needs to bear in mind that many so-called Technicolor films are only hybrids, as they use Eastmancolor (or other) film stock for both the original source and for later dupes from earlier Technicolor theatre release prints. They, along with multigeneration copies, often create many issues including low (fuzzy) resolution and muddy cross-colour effects.

    When making a claim like you have it's imperative you first check a film's manufacturing/printing methods. Tracing its manufacturing provenance is absolutely essential.

    Edit: FYI, pre-WWII B&W film emulsions as used in the Wizard of Oz were never as grain-free or as sharp as modern-day equivalents are. You also need to ensure that you aren't drawing any comparison to these much newer products. The Technicolor process should not be blamed for limitations in the source material.

londons_explore 3 years ago

This is a perfect use for image to image ML models...

Throughout one recording, the phase shift caused by the distortion of the glass screen is probably approximately the same - and therefore could be learned.

Then for the actual decoding, certain elements of the frame should be of approximately known colours - for example someone's face should be skin colour. That then informs the colours for neighbouring objects, since over a small area phase is consistent.

Applying such techniques repeatedly over the whole video, trying to minimize inconsistencies, I'd bet you can get really good results.

Jaruzel 3 years ago

This was used to good effect to 'recolour' the black and white versions of old Dr Who episodes due to the colour originals having been lost/destroyed.

powlow 3 years ago

This is from 2008 - what are the newer developments in this space?

dannyw 3 years ago

A great video series on analog TV and color:

https://youtu.be/dX649lnKAU0

https://youtu.be/InrDRGTPqnE
