AN004 Application Note

Common Video Standards
J. Styer, Sep 28, 2001

While there are different video standards such as NTSC and PAL, there are also different signaling formats. This note covers the most common ones in use today: composite, S-Video, and RGB. YUV and similar types are also discussed briefly.

Up until the mid-1980s, the composite video signal was the most commonly used method of transferring video, with some high-end equipment using separate Red, Green, and Blue (RGB) signals. The composite video signal was designed primarily for broadcast television and is not ideally suited for industrial and higher-performance applications. A primary problem with composite video is that it suffers from a phenomenon called “cross color”: sharp video edges produce a jagged, zipper-like crawling. This problem only grows as video sources, such as newer cameras, produce higher-resolution signals.

This problem is not as bad with broadcast TV signals, since their bandwidth is limited to about 330 lines of horizontal resolution. Industrial and professional applications can have horizontal resolutions exceeding 450 lines, and more than 800 lines in some applications. This high resolution produces sharp luminance edges that bleed into the color decoder, causing unwanted artifacts, usually noticed as a “crawling” through the image.

A second problem also occurs with composite video. Since the color (chroma) information and the luminance (brightness) information travel through the same wire, the notch filter (trap) used to remove the chroma information also removes some of the luminance information. This results in less picture detail and a nonlinear luminance response. Clearly, something better was needed.
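The trap described above can be sketched in a few lines of code. The following is an illustrative model only, not a real decoder design: it implements a simple two-zero FIR notch centered on the NTSC color subcarrier (3.579545 MHz) and shows that a tone at the subcarrier frequency is removed; real traps are analog or more elaborate digital filters, and the 4x-subcarrier sample rate is an assumption made for convenience.

```python
import math

# Assumed parameters for this sketch (not from the original note):
FSC = 3_579_545.0        # NTSC chroma subcarrier, Hz
FS = 4 * FSC             # sample rate: 4x subcarrier, a common digitizing rate

w0 = 2 * math.pi * FSC / FS   # notch center in radians/sample

def notch(x):
    """Two-zero FIR notch: zeros on the unit circle at +/- w0, unity gain at DC."""
    b0, b1, b2 = 1.0, -2.0 * math.cos(w0), 1.0
    dc_gain = b0 + b1 + b2        # normalize so flat areas of the picture pass unchanged
    out = []
    for n in range(len(x)):
        xm1 = x[n - 1] if n >= 1 else 0.0
        xm2 = x[n - 2] if n >= 2 else 0.0
        out.append((b0 * x[n] + b1 * xm1 + b2 * xm2) / dc_gain)
    return out

# A pure tone at the subcarrier frequency is suppressed almost completely...
chroma = [math.sin(2 * math.pi * FSC * n / FS) for n in range(64)]
print(max(abs(v) for v in notch(chroma)[4:]))   # essentially 0

# ...but luminance detail near that frequency is attenuated too, which is
# exactly the loss of picture detail the note describes.
```

The point of the model is the trade-off: anything near the subcarrier frequency is removed, whether it is chroma or fine luminance detail.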

The first answer was to use separate red, green, and blue (RGB) signals, but this was a fairly expensive solution back in the early 1980s. A few other systems appeared for broadcast applications, but none really caught on outside of their intended area. At the same time that industrial applications needed a better solution, one was being sought for home use as well. Ironically, the answer came from an unexpected source at the time: the (then small) computer industry.

The answer was simple: separate the color (C) and luminance (Y) information and carry them through separate wires, a system that became known as YC and later S-Video. This made it reasonably easy to integrate into existing equipment while providing a distinct increase in picture quality. Since the color and luminance are carried on separate conductors, the scheme has the potential to eliminate both the cross-color problem and the trap problem. While most equipment takes full advantage of the signal separation, not all equipment properly implements the two separate channels; some still has the trap in place since it is also used for composite video. S-Video has also become a popular standard for home use, especially with DVD players.

While S-Video provided a substantial increase in image quality, it still had the limitations of an encoded video signal: the luminance (brightness) bandwidth was much higher than the color bandwidth. While the luminance could handle fast transitions, the color needed time to catch up. RGB was the obvious answer, since it has equal bandwidth on all three channels. RGB is still the reigning king for resolution and image quality when used to its full capabilities. However, RGB needs at least three high-quality cables, four if a separate sync signal is used, which is quite common. In some high-end applications (such as workstations) five wires are used, with separate horizontal and vertical drive signals in place of the sync line. Contrast this with the single cable of the composite signal or the dual cable of the S-Video signal. Equipment that processes RGB also needs more channels to deal with the separate red, green, and blue color signals and either of two sync methods (external or sync on green), and all channels must have equal (high) bandwidth. This is part of why some RGB equipment costs more.

Another standard which has grown and faded in popularity several times is YUV and its derivatives. When an RGB source is encoded into composite or S-Video, YUV is the intermediate signal used in the processing. It still limits the color bandwidth, but it is far superior to the encoded composite or S-Video signals, and the luminance is full bandwidth with no trap applied. Other variations have been known as Y/R-Y/B-Y, M-II, and the more recent consumer standard Y/Pb/Pr. Y/Pb/Pr may catch on as an industry standard, since a significant number of new chips support it. All three derivatives are very similar, with only level and other minor differences.
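The RGB-to-YUV encode step mentioned above can be sketched directly. This uses the classic BT.601-style luma weights and color-difference scale factors; the exact coefficients vary between standards and derivatives (Y/Pb/Pr uses different scaling, for instance), so treat these numbers as one representative example rather than a universal definition.

```python
def rgb_to_yuv(r, g, b):
    """Convert normalized RGB (0..1) to Y, U, V.

    Y carries full-bandwidth brightness; U and V are the scaled
    color-difference signals (B-Y and R-Y) that can then be
    band-limited independently of the luminance.
    BT.601-style coefficients, assumed for illustration.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

# Greys carry no color difference, so U and V collapse to zero:
print(rgb_to_yuv(0.5, 0.5, 0.5))   # approximately (0.5, 0.0, 0.0)
```

This also makes the bandwidth argument concrete: only Y needs the full detail of the picture, so U and V can be filtered to a narrower bandwidth with far less visible damage than the composite trap causes.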