Wednesday, April 27, 2011

NTSC demystified - Nuances and Numbers - Part 3

Ever since I understood how NTSC color worked, there has been a nagging question: how does the TV separate the color information (the color sub-carrier) from the luminance information? If the same color were maintained throughout a line, the color carrier could be removed by narrow band-reject filtering. But when the color changes (i.e. the phase changes), how does the TV distinguish color from luminance, which may also be changing?



In fact, a TV without a comb filter (we'll get to comb filters later; for now, think of an old CRT TV) doesn't do a good job of separating luma from chroma [1][2]. Electrically, the signal is riddled with interference because luma and chroma are superposed [1]. The NTSC team chose the line duration, the color carrier frequency and the number of lines per frame very carefully to minimize the visual effect of this interference. In simple terms, the NTSC numbers are chosen so that constructive and destructive interference occur around the same point in both time and space, so the eye "averages" out these effects and masks the defects.

Let's understand this interference pictorially. Square waves are used for illustration (and digitally we'll be generating square waves anyway), although sine waves are what is actually used most often. This time we'll send just ONE line's worth of information to the TV and examine what gets drawn, unlike the previous illustrations (part 1 and part 2) where one line was sent repeatedly.
 
There is a white "pixel" wherever the color changes. Mathematically, color is sent as a DSB-SC modulated wave, and when the color (i.e. the phase) changes rapidly, the spectrum of the chroma signal has enough low-frequency content that the TV interprets it as luma. The square waves in the illustration make this obvious: if we examine the small section of the signal around the color change, it is indistinguishable from a luma change! The greater the phase change, the worse the defect produced.
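To make this concrete, here is a small NumPy sketch. It is an illustration only, not broadcast-accurate: a square-wave subcarrier, an arbitrary 16 samples per cycle, and a single 180-degree hue flip mid-line are all assumptions of the sketch. Averaging over one subcarrier cycle, the way a crude luma low-pass would, gives zero everywhere except around the phase step, and shifting the same signal by one line of 227.5 cycles inverts it.

import numpy as np

CYCLES_PER_LINE = 227.5          # subcarrier cycles per scanline
SAMPLES_PER_CYCLE = 16           # arbitrary sampling density for the sketch
n = int(CYCLES_PER_LINE * SAMPLES_PER_CYCLE)
t = np.arange(n) / SAMPLES_PER_CYCLE          # time, measured in subcarrier cycles

# Square-wave "chroma": the hue flips by 180 degrees (half a cycle) mid-line.
phase = np.where(t < CYCLES_PER_LINE / 2, 0.0, 0.5)      # phase, in cycles
chroma = np.where(((t + phase) % 1.0) < 0.5, 1.0, -1.0)

# Crude luma recovery: average over one subcarrier cycle.  A steady subcarrier
# averages to zero, but around the phase step the average swings away from zero;
# that swing is the spurious "white pixel" the TV mistakes for luma.
kernel = np.ones(SAMPLES_PER_CYCLE) / SAMPLES_PER_CYCLE
luma_seen = np.convolve(chroma, kernel, mode='same')
print("|luma| away from the step:", np.abs(luma_seen[16:n // 2 - 16]).max())
print("|luma| around the step   :", np.abs(luma_seen[n // 2 - 16:n // 2 + 16]).max())

# The next line carries the same colors but starts 227.5 cycles later,
# so its chroma waveform is exactly inverted relative to this one.
next_line = np.where((((t + CYCLES_PER_LINE) + phase) % 1.0) < 0.5, 1.0, -1.0)
print("next line's chroma inverted:", bool(np.all(next_line == -chroma)))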
Now, let's look at two consecutive lines, both of which try to produce identical colors on the TV screen. A few things pop up immediately:
1. The color burst appears to change phase from line to line! My first thought was that something must be wrong, since such a phase change exists only in PAL. But looking carefully at the color carrier maintained by the TV, we see that even though the phase of the color burst appears to change, there is no actual phase change in the color carrier. The reason for the apparent change is that the duration of each scanline is precisely 227.5 subcarrier cycles.
2. Since every color is described by the phase of the chroma signal relative to the color burst, the chroma on the second line, even though it produces the same colors, is inverted compared to the chroma on the first line.
3. An important consequence of the above two observations is that interference between luma and chroma, if constructive on one line, is essentially destructive on the next. A full screen will look like this:

The NTSC signal thus cancels this defect spatially. Since each frame consists of 525 lines, a frame lasts 525 × 227.5 = 119,437.5 subcarrier cycles, again a half-integer, so the interference pattern also inverts from one frame to the next and the cancellation happens temporally too.
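As a quick sanity check (plain Python, using the 227.5 cycles/line figure assumed throughout):

cycles_per_line = 227.5
lines_per_frame = 525
cycles_per_frame = cycles_per_line * lines_per_frame
print(cycles_per_frame)        # 119437.5: a half-integer number of cycles, so every
print(cycles_per_frame % 1.0)  # point on the screen sees the subcarrier inverted on the next frame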

The same image animated.



Click on the two figures above to view the animation. The dot crawl is a side effect of this cancellation; it is not usually visible, but vertical columns of color tend to expose it.

One problem arises with "fake-progressive" scan, which outputs 262 lines per frame. With an even number of lines per frame, the frame contains a whole number of subcarrier cycles, so temporal interference cancellation does not happen and the alternating black/white stripes stand still. One simple way around this is to make one line last 228 cycles of the color carrier and the remaining 261 lines last 227.5 cycles each. That one line can be placed where it is not visible. This way, temporal cancellation works even in "fake-progressive" mode.
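The same arithmetic shows why the 262-line mode misbehaves and why the single 228-cycle line fixes it (again assuming the cycle counts quoted above):

fake_progressive = 262 * 227.5              # 59605.0 -> a whole number of cycles,
                                            # so the dot pattern repeats in place
with_long_line   = 261 * 227.5 + 1 * 228    # 59605.5 -> half-cycle offset restored,
                                            # so the pattern inverts every frame
print(fake_progressive, fake_progressive % 1.0)
print(with_long_line, with_long_line % 1.0)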

If each line were an integer multiple (say 228) of the color carrier period, then every color change would produce a stationary dark or bright stripe. The Atari 2600 and the Apple II generated color this way. That's why their graphics look inferior to the Commodore 64 / NES, and why they also have greater trouble syncing with newer TVs.
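The per-line version of the same check shows why an integer cycle count makes the stripes stationary:

for cycles_per_line in (227.5, 228.0):      # broadcast NTSC vs Atari/Apple-style timing
    offset = cycles_per_line % 1.0
    print(cycles_per_line, "->", offset, "cycle offset line-to-line")
# 0.5: the interference flips polarity every line and averages out vertically
# 0.0: the interference lands in the same place every line -> fixed vertical stripes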

Now let's see why each line needs to be 227.5 cycles and not, say, 220.5, and why the color carrier was chosen as 3.579545 MHz.
NTSC is a compatible standard: there was already a B&W standard before it. In that B&W standard, the line frequency was 15750 Hz, equivalent to a line duration of 63.492 µs. A color carrier superposed on the existing B&W signal has to be of sufficiently high frequency that the combined signal still looks acceptable on older B&W TVs, but choosing too high a carrier leaves very little bandwidth for chroma, since the B&W signal's bandwidth was only about 4.2 MHz. As a compromise, a color carrier of 3.6 MHz could have been chosen: it gives about 0.6 MHz of chroma bandwidth while being high enough that B&W TVs are happy. However, the sound carrier at 4.5 MHz would interfere with a 3.6 MHz color carrier, and the resulting 900 kHz beat would become visible. Going down a notch, 3.579545 MHz hits the spot: the beat is 920.455 kHz, which is very nearly a half-multiple of the line frequency (about 58.5 times it), and as we've seen this reduces the visibility of the interference dramatically. A more complete treatment is presented in [1].
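A rough numerical check of that last step, comparing both candidate carriers against the same color line rate (a simplification assumed for this sketch):

f_sound = 4.5e6                        # sound carrier, Hz
f_line  = 3.579545e6 / 227.5           # ~15734.26 Hz, the color line rate
for f_color in (3.6e6, 3.579545e6):
    beat = f_sound - f_color
    print(f"{f_color/1e6:.6f} MHz carrier: beat {beat/1e3:.3f} kHz"
          f" = {beat/f_line:.2f} line rates")
# 3.6 MHz      -> 900.000 kHz ~ 57.20 line rates: not a half-multiple, stays visible
# 3.579545 MHz -> 920.455 kHz ~ 58.50 line rates: a half-multiple, averages out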

Finally, comb filters can reduce the interference greatly, at the cost of some vertical luma resolution. The waveform of each line is added to that of the previous line; because the chroma is inverted from one line to the next, this effectively separates luma from chroma as long as two successive lines don't differ too much.
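A minimal sketch of such a one-line ("1H") comb, under the same simplifying assumptions as the earlier snippet (square-wave subcarrier, exactly 227.5 cycles per line, and two successive lines carrying identical picture content):

import numpy as np

SAMPLES_PER_CYCLE = 16
CYCLES_PER_LINE = 227.5
N = int(CYCLES_PER_LINE * SAMPLES_PER_CYCLE)
t = np.arange(N) / SAMPLES_PER_CYCLE                  # time in subcarrier cycles

def composite(line_number, luma, chroma_amp):
    """Toy composite signal: luma plus a square-wave chroma of fixed hue."""
    cycles = t + line_number * CYCLES_PER_LINE        # absolute subcarrier phase
    subcarrier = np.where((cycles % 1.0) < 0.5, 1.0, -1.0)
    return luma + chroma_amp * subcarrier

luma = 0.4 + 0.3 * (t > 100)                          # some horizontal luma detail
line_a = composite(0, luma, 0.2)
line_b = composite(1, luma, 0.2)                      # next line, same picture

comb_luma   = 0.5 * (line_a + line_b)   # chroma is inverted line to line -> cancels
comb_chroma = 0.5 * (line_a - line_b)   # luma is identical line to line -> cancels

print("luma error  :", np.abs(comb_luma - luma).max())
print("chroma error:", np.abs(comb_chroma - (line_a - luma)).max())

If the two successive lines do differ, whatever differs between them leaks into the wrong output, which is exactly the loss of vertical resolution mentioned above.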
Comb filtering is also done temporally. The TV first makes sure it is displaying a static image (this is useless for moving pictures), then "adds" two successive frames to separate luma from chroma. Because of the frame-to-frame phase inversion, the addition eliminates the chroma signal. With the two separated, the TV can draw the picture without artifacts.
Older CRT TVs don't have comb filters and rely on the eye to do the job that comb filters do in newer TVs.

Comb filters essentially assume that each line is 227.5 subcarrier cycles long. When this isn't true, they perform badly, which is why the Atari 2600 and Apple II look worse on newer TVs.

References
[1] PAL SECAM NTSC Conversion
[2] Dot Crawl - Wikipedia
