NTSC (an acronym of National Television System Committee) was the first American standard for analog television, published and adopted in 1941. [1] It was one of three major color formats for analog television; the others were PAL and SECAM. NTSC color was usually associated with System M, and this combination was sometimes called NTSC II. [2] [3] A second NTSC standard was adopted in 1953, [4] which allowed color television compatible with the existing stock of black-and-white sets. [5] [6] [7] The EIA defined NTSC performance standards in EIA-170 (also known as RS-170) in 1957. [8]
The term "NTSC" has referred to digital formats with 480–487 active lines and a 30 or 29.97 FPS frame rate since the introduction of digital sources such as DVDs, and is a digital shorthand for System M. The NTSC-Film standard has a digital resolution of 720 × 480 pixels for DVD-Videos, 480 × 480 pixels for Super Video CDs (SVCD, aspect ratio 4:3) and 352 × 240 pixels for Video CDs (VCD). [9] The digital video (DV)-camcorder format equivalent of NTSC is 720 × 480 pixels. [10] The digital television (DTV) equivalent is 704 × 480 pixels. [10]
The NTSC was established in 1940 by the United States Federal Communications Commission (FCC) to resolve conflicts between companies about the introduction of a nationwide analog television system. In March 1941, the committee issued a technical standard for black-and-white television based on a 1936 recommendation by the Radio Manufacturers Association (RMA). Technical advancements of the vestigial sideband technique provided an opportunity to increase image resolution. The NTSC selected 525 scan lines as a compromise between RCA's 441-scan line standard (used by RCA's NBC TV network) and Philco and DuMont's desire to increase the number of scan lines to between 605 and 800. The standard recommended a frame rate of 30 FPS, consisting of two interlaced fields per frame at 262.5 lines per field and 60 fields per second. Other standards in the final recommendation were an aspect ratio of 4:3 and frequency modulation (FM) of the sound signal.
In January 1950, the committee was reconstituted to standardize color television. The FCC had briefly approved a 405-line field-sequential color TV standard, developed by CBS, in October 1950. [11] The CBS system was incompatible with existing black-and-white sets. It used a rotating color wheel, reduced the number of scan lines from 525 to 405, and increased the field rate from 60 to 144 with an effective frame rate of 24 FPS. Legal action by rival RCA kept commercial use of the system off the air until June 1951, and regular broadcasts only lasted a few months before the manufacture of all color sets was banned by the Office of Defense Mobilization in October (ostensibly due to the Korean War). [12] [13] [14] [15] A variant of the CBS system was later used by NASA to broadcast pictures of astronauts in space. [16] CBS rescinded its system in March 1953, [17] and the FCC replaced it on December 17 of that year with an NTSC color standard developed by several companies (including RCA and Philco). [18]
In December 1953, the FCC unanimously approved what became the NTSC color-television standard (later defined as RS-170a). The standard retained backward compatibility with existing black-and-white sets. Color information was added to the black-and-white image by introducing a color subcarrier of 315⁄88 MHz (3.579545 MHz ± 10 Hz). [19] This frequency was chosen so horizontal line-rate modulation components of the chrominance signal fall between the horizontal line-rate modulation components of the luminance signal; the chrominance signal could be easily filtered out of the luminance signal on new sets, and would be minimally visible on existing sets. Due to limitations of frequency divider circuits when the color standard was promulgated, the color subcarrier frequency was constructed as a composite frequency assembled from small integers, in this case 5 × 7 × 9⁄(8 × 11) MHz. The horizontal line rate was reduced from 15,750 to 15,734 lines per second (2 × 3.579545⁄455 MHz = 9⁄572 MHz), and the frame rate was reduced from 30 FPS to 30⁄1.001 ≈ 29.970 FPS (the horizontal line rate divided by 525 lines per frame). The changes amounted to 0.1 percent, and were tolerated by existing TV sets. [20] [21]
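These relationships can be checked with exact rational arithmetic. The sketch below holds frequencies in MHz and verifies the factored subcarrier, the derived line rate, and the 30⁄1.001 frame rate:

```python
from fractions import Fraction

# Color subcarrier assembled from small integers: 5 * 7 * 9 / (8 * 11) MHz
f_sc = Fraction(5 * 7 * 9, 8 * 11)       # = 315/88 MHz
print(float(f_sc))                       # 3.579545... MHz

# Horizontal line rate: 2 * f_sc / 455 = 9/572 MHz
f_h = 2 * f_sc / 455
assert f_h == Fraction(9, 572)
print(float(f_h) * 1_000_000)            # ~15734.27 lines per second

# Frame rate: line rate divided by 525 lines per frame = 30/1.001 FPS
f_frame = f_h * 1_000_000 / 525
assert f_frame == Fraction(30_000, 1_001)
print(float(f_frame))                    # ~29.970 FPS
```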
The first publicly-announced network television broadcast of a program using the NTSC "compatible color" system was an episode of NBC's Kukla, Fran and Ollie on August 30, 1953, viewable in color only at NBC headquarters. [22] The first nationwide viewing of NTSC color was on the following January 1 with the coast-to-coast broadcast of the Tournament of Roses Parade, viewable on prototype color receivers at special presentations nationwide. The first color NTSC television camera was the RCA TK-40, used for experimental broadcasts in 1953; an improved version, the TK-40A (introduced in March 1954), was the first commercially-available color-television camera. Later that year, an improved TK-41 became the standard camera and was used through much of the 1960s. The NTSC standard was adopted by other countries, including Japan and several in the Americas.
With the advent of digital television, analog broadcasts were largely phased out. NTSC broadcasters in the U.S. were required by the FCC to shut down their analog transmitters by February 17, 2009; the shutdown was later moved to June 12 of that year. Low-power and Class A stations and translators were required to shut down by 2015, although an FCC extension allowed some stations operating on Channel 6 to operate until July 13, 2021. [23] Canadian analog TV transmitters in markets not subject to the mandatory 2011 transition were to be shut down by January 14, 2022, under a 2017 schedule from Innovation, Science and Economic Development Canada. [24]
Most countries using the NTSC standard and those using other analog television standards have switched (or are switching) to newer digital television standards; at least four different standards are in use worldwide. North America, parts of Central America, and South Korea are adopting (or have adopted) the ATSC standards; other countries, such as Japan, are adopting (or have adopted) standards other than ATSC. Most over-the-air NTSC transmissions in the United States ended on June 12, 2009, [25] and by August 31, 2011, [26] in Canada and most other NTSC markets. [27]
Colorimetry refers to the colorimetric characteristics of the system and its components, including the primary colors used, the camera, and the display. NTSC color had two distinctly-defined colorimetries, shown on the chromaticity diagram as NTSC 1953 and SMPTE C. Manufacturers introduced a number of variations for technical, economic, marketing, and other reasons. [28]
| Color space | Standard | Year | White point x | White point y | CCT (K) | Rx | Ry | Gx | Gy | Bx | By | Display gamma (EOTF) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| NTSC | ITU-R BT.470/601 (M) | 1953 | 0.310 | 0.316 | 6774 (C) | 0.67 | 0.33 | 0.21 | 0.71 | 0.14 | 0.08 | 2.2 |
| SMPTE C | SMPTE RP 145 (C), 170M, 240M | 1987 | 0.3127 | 0.329 | 6500 (D65) | 0.63 | 0.34 | 0.31 | 0.595 | 0.155 | 0.07 | |

Primary colors are given as CIE 1931 xy chromaticity coordinates.
Note: displayed colors are approximate and require a wide gamut display for faithful reproduction.
The original 1953 color NTSC specification, still part of the United States Code of Federal Regulations, defined the colorimetric values of the system as shown in the table. [29] Early color-television receivers, such as the RCA CT-100, were faithful to this specification (based on prevailing motion-picture standards) which had a larger gamut than most present-day monitors. Their low-efficiency phosphors (notably in red) were weak and persistent, leaving trails after moving objects. Beginning in the late 1950s, picture-tube phosphors sacrificed saturation for increased brightness; this deviation from the standard at receiver and broadcaster was the source of considerable color variation.
To ensure more uniform color reproduction, some manufacturers incorporated color-correction circuits into sets that converted between the received signal (encoded for the 1953 colorimetric values) and the actual characteristics of the monitor's phosphors. Since such correction cannot be performed accurately on the nonlinear, gamma-corrected signals as transmitted, the adjustment can only be approximate.
At the broadcaster stage, in 1968–69 the Conrac Corporation (working with RCA) defined a set of controlled phosphors for use in broadcast color video monitors. [30] This specification survives as the SMPTE C phosphor specification. [31] As with home receivers, it was recommended [32] that studio monitors incorporate similar color-correction circuits so broadcasters would transmit pictures encoded for the original 1953 colorimetric values in accordance with FCC standards.
In 1987, the Society of Motion Picture and Television Engineers (SMPTE) Committee on Television Technology Working Group on Studio Monitor Colorimetry adopted the SMPTE C (Conrac) phosphors for general use in Recommended Practice 145; [33] this prompted many manufacturers to modify their camera designs to encode for SMPTE C colorimetry without color correction [34] as approved in SMPTE standard 170M, "Composite Analog Video Signal – NTSC for Studio Applications" (1994). The ATSC digital television standard states that for 480i signals, SMPTE C colorimetry should be assumed unless colorimetric data is included in the transport stream. [35]
Japanese NTSC never changed its primaries and white point to SMPTE C, continuing to use the 1953 NTSC values. [32] The PAL and SECAM systems used the original 1953 NTSC colorimetry until 1970; [32] unlike NTSC, the European Broadcasting Union (EBU) rejected color correction in receivers and studio monitors and called for all equipment to encode signals for the EBU colorimetric values. [36]
In the gamuts on the CIE chromaticity diagram, variations among colorimetries can result in visual differences. Proper viewing requires gamut mapping via LUTs or additional color grading. SMPTE Recommended Practice RP 167-1995 refers to such an automatic correction as an "NTSC corrective display matrix." [37] Material prepared for 1953 NTSC may look de-saturated when displayed on SMPTE C or ATSC/BT.709 displays, and may have noticeable hue shifts. SMPTE C materials may appear slightly more saturated on BT.709/sRGB displays, or significantly more saturated on P3 displays, if appropriate gamut mapping is not done.
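The size of these gamut differences can be quantified from the xy coordinates in the table above. The following sketch derives linear RGB-to-XYZ matrices from the primaries and white points using the standard method; note that no chromatic adaptation between Illuminant C and D65 is applied, so converting white through XYZ intentionally leaves the two white points distinct:

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Derive a linear RGB->XYZ matrix from CIE 1931 xy primaries and white point."""
    # Columns are the unscaled XYZ of each primary (Y set to 1)
    M = np.array([[x / y, 1.0, (1.0 - x - y) / y] for x, y in primaries]).T
    wx, wy = white
    W = np.array([wx / wy, 1.0, (1.0 - wx - wy) / wy])  # white XYZ at Y = 1
    S = np.linalg.solve(M, W)   # per-channel scales so RGB = (1,1,1) maps to white
    return M * S

# xy values taken from the colorimetry table above
ntsc_1953 = rgb_to_xyz_matrix([(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)], (0.310, 0.316))
smpte_c = rgb_to_xyz_matrix([(0.63, 0.34), (0.31, 0.595), (0.155, 0.07)], (0.3127, 0.329))

# Linear-light conversion matrix from 1953 NTSC RGB to SMPTE C RGB
convert = np.linalg.inv(smpte_c) @ ntsc_1953
print(convert.round(3))
```

A gamut-mapping LUT or corrective matrix in a monitor performs the role of `convert` (plus gamma handling and clipping, which this sketch omits).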
NTSC uses a luminance-chrominance encoding system. Using a separate luminance signal maintained backward compatibility with contemporary black-and-white television sets; only color sets would recognize the chroma signal.
The red, green, and blue primary color signals are weighted and summed into a single luma signal, designated Y′ (Y prime), [38] which replaces the original monochrome signal. The color-difference information is encoded into the chrominance signal, which carries only the color information. This allows black-and-white receivers to display NTSC color signals by ignoring the chrominance signal. Some black-and-white TVs sold in the U.S. after the introduction of color broadcasting in 1953 were designed to filter the chroma out, but early sets did not do this and chrominance could be seen as a crawling dot pattern in areas of the picture with saturated colors. [39]
To derive separate signals with only color information, the difference is determined between each color primary and the summed luma; the red difference signal is R′ − Y′, and the blue difference signal is B′ − Y′. These difference signals are used to derive two new color signals, known as I (in-phase) and Q (in quadrature), in a process known as quadrature amplitude modulation (QAM). The I/Q color space is rotated relative to the difference-signal color space; orange-blue color information (to which the human eye is most sensitive) is transmitted on the I signal at 1.3 MHz bandwidth, and the Q signal encodes purple-green color information at 0.4 MHz bandwidth. This allows the chrominance signal to use less overall bandwidth without noticeable color degradation. The two signals each amplitude-modulate [40] 3.58 MHz carriers which are 90 degrees out of phase with each other, [41] and the result is their sum with the carriers suppressed. [42] [40] The result can be viewed as a single sine wave with varying phase relative to a reference carrier and with varying amplitude; the phase represents the instantaneous color hue captured by a TV camera, and the amplitude represents the color saturation. The 315⁄88 MHz subcarrier is added to the luminance to form the composite color signal, [40] which modulates the video-signal carrier. [43]
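The encoding step can be sketched as follows, using commonly cited rounded NTSC coefficients (exact values vary slightly between references):

```python
def rgb_to_yiq(r, g, b):
    """Encode gamma-corrected R'G'B' components (0..1) into Y'IQ.

    The luma weights are the NTSC/BT.601 values; the I and Q rows are
    commonly cited rounded NTSC encoding coefficients."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma (monochrome-compatible)
    i = 0.596 * r - 0.274 * g - 0.322 * b   # orange-blue axis, ~1.3 MHz
    q = 0.211 * r - 0.523 * g + 0.312 * b   # purple-green axis, ~0.4 MHz
    return y, i, q

# White carries full luma and zero chrominance; a black-and-white set
# displaying Y' alone therefore renders it correctly.
print(rgb_to_yiq(1.0, 1.0, 1.0))  # approximately (1.0, 0.0, 0.0)
```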
For a color TV to recover hue information from the color subcarrier, it must have a zero-phase reference to replace the suppressed carrier. The NTSC signal includes a short sample of this reference signal, known as the colorburst, located on the back porch of each horizontal synchronization pulse. The colorburst consists of at least eight cycles of the unmodulated color subcarrier. The TV receiver has a local oscillator which is synchronized with these color bursts to create a reference signal. Combining this reference phase signal with the chrominance signal allows the recovery of the I and Q signals, which (with the Y′ signal) are used to reconstruct the individual R, G, and B signals sent to the CRT to form the image.
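The modulation and recovery described above can be illustrated with a toy quadrature modulator and synchronous detector. This is a sketch only: it assumes a noise-free signal, a reference oscillator perfectly phase-locked to the colorburst, and sampling at four times the subcarrier frequency:

```python
import math

F_SC = 315 / 88 * 1e6   # color subcarrier, Hz
FS = 4 * F_SC           # sample rate; 4x the subcarrier, assumed for simplicity

def modulate(i, q, n):
    """Quadrature-modulate fixed I and Q values onto the suppressed subcarrier."""
    phase = 2 * math.pi * F_SC * n / FS
    return i * math.cos(phase) + q * math.sin(phase)

def demodulate(samples):
    """Synchronous detection: multiply by the regenerated carrier (assumed
    phase-locked to the colorburst) and average to recover I and Q."""
    count = len(samples)
    i = sum(2 * s * math.cos(2 * math.pi * F_SC * n / FS)
            for n, s in enumerate(samples)) / count
    q = sum(2 * s * math.sin(2 * math.pi * F_SC * n / FS)
            for n, s in enumerate(samples)) / count
    return i, q

# 100 subcarrier cycles of chrominance carrying I = 0.3, Q = -0.2
chroma = [modulate(0.3, -0.2, n) for n in range(4 * 100)]
print(demodulate(chroma))  # approximately (0.3, -0.2)
```

A real receiver performs the same multiplication in analog circuitry (or digital sampling) and low-pass filters the product instead of averaging over a fixed window.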
In CRT televisions, the NTSC signal is turned into three color signals: red, green, and blue, each controlling an electron gun designed to excite only the corresponding red, green, or blue phosphor dots. TV sets with digital circuitry use sampling techniques to process the signals, with identical results. In either case, the original three color signals are transmitted as three discrete signals (Y′, I, and Q), recovered as three separate colors (R, G, and B), and presented as a color image.
When a transmitter broadcasts an NTSC signal, it amplitude-modulates a radio-frequency carrier with the NTSC signal and frequency-modulates a carrier 4.5 MHz higher with the audio signal. With non-linear distortion of the broadcast signal, the 315⁄88 MHz color carrier may beat with the sound carrier to produce a dot pattern on the screen.
A transmitted NTSC television channel has a total bandwidth of 6 MHz. The actual video signal, which is amplitude-modulated, is transmitted between 500 kHz and 5.45 MHz above the lower end of the channel. The video carrier is 1.25 MHz above the lower end of the channel. Like most AM signals, the video carrier generates two sidebands: one above the carrier and one below. Each sideband is 4.2 MHz wide. The upper sideband is transmitted, but only 1.25 MHz of the lower sideband (known as a vestigial sideband) is transmitted. The color subcarrier, 3.579545 MHz above the video carrier, is quadrature-amplitude-modulated with a suppressed carrier. The audio signal is frequency modulated with a 25-kHz maximum frequency deviation, less than the 75 kHz deviation on the FM band. The main audio carrier is 4.5 MHz above the video carrier, 250 kHz below the top of the channel. Sometimes a channel may contain an MTS signal, which offers more than one audio signal by adding one or two subcarriers to the audio signal; this is normally the case when stereo audio or second audio program signals are used. The same extensions are used in ATSC, whose digital carrier is 0.31 MHz above the low end of the channel.
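The channel layout above can be expressed numerically. The sketch below places the carriers within a 6 MHz channel; US VHF channel 2 (54–60 MHz) is used as the example:

```python
def carriers(channel_low_edge_mhz):
    """Carrier positions (MHz) within a 6 MHz NTSC (System M) channel."""
    video = channel_low_edge_mhz + 1.25   # video carrier, 1.25 MHz above channel edge
    color = video + 315 / 88              # color subcarrier, 3.579545... MHz above video
    audio = video + 4.5                   # audio carrier, 250 kHz below channel top
    return video, color, audio

video, color, audio = carriers(54.0)      # US VHF channel 2 occupies 54-60 MHz
print(video, round(color, 6), audio)      # 55.25 58.829545 59.75
```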
| Color | Luminance level (IRE) | Chrominance level (IRE) | Chrominance amplitude (IRE) | Phase (°) |
|---|---|---|---|---|
| White | 100.0 | 0.0 | 0.0 | – |
| Yellow | 89.5 | 48.1 to 130.8 | 82.7 | 167.1 |
| Cyan | 72.3 | 13.9 to 130.8 | 116.9 | 283.5 |
| Green | 61.8 | 7.2 to 116.4 | 109.2 | 240.7 |
| Magenta | 45.7 | −8.9 to 100.3 | 109.2 | 60.7 |
| Red | 35.2 | −23.3 to 93.6 | 116.9 | 103.5 |
| Blue | 18.0 | −23.3 to 59.4 | 82.7 | 347.1 |
| Black | 7.5 | 0.0 | 0.0 | – |
Film has a frame rate of 24 frames per second, and the NTSC standard runs at approximately 29.97 FPS (30⁄1.001). In regions with 25-FPS television and video standards, this difference can be overcome by speed-up; for 30-FPS standards, 3:2 pulldown is used. One film frame is transmitted for three video fields (lasting 1+1⁄2 video frames), and the next frame is transmitted for two video fields (lasting one video frame). Two film frames are thus transmitted in five video fields, for an average of 2+1⁄2 video fields per film frame, and the average film-frame rate is 60 ÷ 2.5 = 24 frames per second.
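The cadence can be sketched as follows (field parity and interlaced frame boundaries are omitted for clarity):

```python
def three_two_pulldown(frames):
    """Map a sequence of 24-FPS film frames onto 60 Hz interlaced fields.

    Frames alternately occupy three fields and two fields, so every two
    film frames fill five video fields (2.5 fields per frame on average)."""
    fields = []
    for index, frame in enumerate(frames):
        fields.extend([frame] * (3 if index % 2 == 0 else 2))
    return fields

# Four film frames become ten fields (five interlaced video frames)
print("".join(three_two_pulldown("ABCD")))  # AAABBCCCDD
```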
Film shot specifically for NTSC television usually has a speed of 30 frames per second to avoid 3:2 pulldown. [45] To show 25-FPS material (such as European television series and some European films) on NTSC equipment, every fifth frame is duplicated and the resulting stream is interlaced.
Film shot for NTSC television at 24 frames per second has traditionally been accelerated by 1⁄24 (to about 104.17 percent of normal speed) for transmission in regions with 25-FPS television standards. This increase in picture speed has traditionally been accompanied by a corresponding increase in audio pitch and tempo. Alternatively, frame blending can be used to convert 24-FPS video to 25 FPS without altering its speed.
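The arithmetic of the speed-up is straightforward:

```python
import math

# 24-FPS film played at 25 FPS runs at 25/24 of normal speed
speedup = 25 / 24
print(f"{speedup:.4%}")        # 104.1667%

# A 100-minute film is shortened accordingly
runtime_min = 100 / speedup
print(round(runtime_min, 2))   # 96.0 minutes

# Without pitch correction, audio rises by the same ratio
semitones = 12 * math.log2(speedup)
print(round(semitones, 2))     # about 0.71 of a semitone
```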
Film shot for television in regions with 25-FPS television standards can be handled in one of two ways:
An NTSC frame has two fields: F1 and F2. Field dominance depends on a combination of factors, including decisions by equipment manufacturers and historical conventions. Most professional equipment can switch between a dominant upper or dominant lower field. [25] [46] Field dominance is important when editing NTSC video; incorrect interpretation of field order can cause a shuddering effect, as moving objects jump backward and forward on successive fields. This matters when interlaced NTSC is transcoded to a format with a different field dominance, and when progressive video is transcoded to interlaced NTSC, where incorrect field dominance produces flash fields in the interlaced video. Three-two pulldown, converting 24 FPS to 30, will also produce unacceptable results with an incorrect field order.
Unlike PAL and SECAM, which were paired with a variety of broadcast systems around the world, NTSC color encoding was almost invariably used with CCIR System M.
NTSC-N was originally proposed in the 1960s to the CCIR as a 50-Hz broadcast method for the System N countries Paraguay, Uruguay, and Argentina before they chose PAL. In 1978, with the introduction of the Apple II Europlus, it was reintroduced as NTSC 50: a system combining 625-line video with 3.58-MHz NTSC color. An Atari ST running PAL software on its NTSC color display used this system, since the monitor could not decode PAL color. Most analog NTSC television sets and monitors with a vertical-hold control can display this system once the vertical hold is adjusted. [47]
NTSC 4.43 transmits an NTSC color subcarrier of 4.43 MHz instead of 3.58 MHz. [48] The output is only viewable by TVs which support the system, such as most PAL sets. [49]
In January 1960, seven years before adoption of the modified SECAM version, the experimental TV studio in Moscow began broadcasting with the OSKM system. OSKM was the version of NTSC adapted to the European D/K 625/50 standard. OSKM is an acronym for "Simultaneous system with quadrature modulation" (Russian:Одновременная Система с Квадратурной Модуляцией). It used the color-coding scheme later used in PAL (U and V, instead of I and Q).
The color subcarrier frequency was 4.4296875 MHz, and the bandwidth of the U and V signals was near 1.5 MHz. [50] About 4,000 TV sets in four models (Raduga, [51] Temp-22, Izumrud-201 and Izumrud-203) [52] were produced, but they were not commercially available.
In NTSC (and, to a lesser extent, PAL), reception problems can degrade the color accuracy of the picture; ghosting can change the phase of the colorburst, altering a signal's color balance. The vacuum-tube electronics used in televisions through the 1960s led to technical problems, which is why NTSC televisions were equipped with a tint control. Hue controls are still found on NTSC TVs, but color drifting generally ceased to be a problem by the 1970s. Compared to PAL in particular, NTSC color accuracy and consistency were sometimes considered inferior; video professionals and television engineers jokingly referred to NTSC as Never The Same Color, Never Twice the Same Color, or No True Skin Colors. [53]
A standard NTSC video image contains invisible scan lines (lines 1–21 of each field), known as the vertical blanking interval (VBI); lines 1–9 are used for the vertical-sync and equalizing pulses. The remaining lines were blanked in the original NTSC specification to provide time for the electron beam on CRT screens to return to the top of the display.
VIR (vertical interval reference), adopted during the 1980s, attempts to correct some NTSC color problems by adding studio-inserted reference data for luminance and chrominance levels on line 19. [54] Suitably-equipped television sets could then use the data to adjust the display for a closer match to the original studio image. The VIR signal has three sections; the first has 70 percent luminance and the same chrominance as the colorburst signal and the other two have 50 percent and 7.5 percent luminance, respectively. [55]
The remaining VBI lines are typically used for datacasting or ancillary data such as video editing timestamps (vertical interval timecodes or SMPTE timecodes on lines 12–14), [56] [57] test data on lines 17–18, a network source code on line 20 and closed captioning, XDS, and V-chip data on line 21. Early teletext applications also used VBI lines 14–18 and 20, but teletext with NTSC was never widely adopted. [58]
Some stations transmitted TV Guide On Screen data (an electronic program guide) on VBI lines 11–18, 20, and 22. The primary station in a market (often a local PBS station) broadcast four lines, and backup stations transmitted one. TVGOS was discontinued in 2013 and 2016, ending OTA program-guide services for compatible devices. [59] [60]
| Country | Switched to | Switchover completed |
|---|---|---|
| | ATSC | December 4, 2024 |
| | DVB-T | March 1, 2016 |
| | ATSC | August 31, 2011 (select markets) [f] |
| | ISDB-Tb | April 9, 2024 [78] |
| | ISDB-Tb | August 15, 2019 |
| | ATSC | December 15, 2021 [g] |
| | ISDB-Tb | December 1, 2024 |
| | ISDB-Tb | December 31, 2024 [h] |
| | ISDB-Tb | December 31, 2019 [i] |
| | ISDB-T | March 31, 2012 [j] |
| | ATSC | December 31, 2015 (full-power stations) [k] [85] |
| | ATSC | March 5, 2024 |
| | ISDB-Tb | December 31, 2024 [l] |
| | ATSC | December 31, 2012 |
| | ATSC | June 17, 2015 [87] |
| | DVB-T | June 30, 2012 |
| | ATSC | June 12, 2009 (full-power stations) [88] [89]; September 1, 2015 (Class A stations) [90]; July 13, 2021 (low-power stations) [91] [92] |