== Rationale for NTSC color burst frequency ==
{{Details|NTSC#Color encoding}}

The original black and white NTSC television standard specified a frame rate of 30 Hz and 525 lines per frame, or 15750 lines per second. The audio was frequency modulated 4.5 MHz above the video carrier. Because the signal was black and white, the video consisted only of luminance (brightness) information. The line-based nature of the video meant that the luminance data was not spread uniformly across the [[frequency domain]]; although all of the intervening spectrum was occupied, the energy was concentrated at multiples of the line rate. Plotted on a [[spectrogram]], the video signal looked like the teeth of a comb or a gear rather than smooth and uniform.

[[RCA]] discovered<ref>{{Cite journal|last=Brown and Luck|date=June 1953|title=Principles and Development of Color Television Systems|url=http://www.americanradiohistory.com/ARCHIVE-RCA/RCA-Review/RCA-Review-1953-June.pdf|journal=RCA Review|volume=XIV|pages=155–156}}</ref> that if the [[chrominance]] (color) information, which had a similar spectrum, was modulated on a carrier that was a [[half-integer]] multiple of the line rate, its signal peaks would fit neatly between the peaks of the luminance data, minimizing interference. The interference was not eliminated, but what remained was not readily apparent to human eyes. (Modern televisions attempt to reduce it further using a [[comb filter]].)

To provide sufficient bandwidth for the chrominance signal, yet interfere only with the highest-frequency (and thus least perceptible) portions of the luminance signal, a chrominance subcarrier near 3.6 MHz was desirable. A subcarrier at 227.5 = 455/2 times the line rate was close to the right frequency, and 455's small factors (5 × 7 × 13) made a frequency divider easy to construct.

However, additional interference could come from the [[audio signal]]. To minimize interference there, it was similarly desirable to make the distance between the chrominance [[carrier frequency]] and the audio carrier frequency a half-integer multiple of the line rate. Because the sum of two half-integers is an integer, this implies that the spacing between the luminance carrier and the audio carrier must be an integer multiple of the line rate. The original NTSC standard, with a 4.5 MHz carrier spacing and a 15750 Hz line rate, did not meet this requirement: the audio carrier was about 285.714 times the line rate.

While existing black and white receivers could not decode a signal with a different audio carrier frequency, they could easily use the copious timing information in the video signal to decode a slightly slower line rate. Thus, the new color television standard reduced the line rate by a factor of 1.001, to 1/286 of the 4.5 MHz audio subcarrier frequency, or about 15734.2657 Hz. This reduced the frame rate to 30/1.001 ≈ 29.9700 Hz, and placed the color subcarrier at 227.5/286 = 455/572 = 35/44 of the 4.5 MHz audio subcarrier.<ref>{{Cite web|url=https://www.antiqueradio.org/art/NTSC%20Signal%20Specifications.pdf|title=NTSC Signal Specifications|date=23 May 2018|website=Antique Radio.org}}</ref>
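
As a worked check of the figures above, using only the 4.5 MHz carrier spacing, the divisor of 286, and the 525 lines per frame already stated in the text (the notation <math>f_\text{line}</math>, <math>f_\text{frame}</math>, and <math>f_\text{sc}</math> for the line, frame, and color subcarrier frequencies is introduced here for convenience):

<math display="block">
\begin{align}
f_\text{line} &= \frac{4.5\ \text{MHz}}{286} \approx 15734.2657\ \text{Hz},\\[4pt]
f_\text{frame} &= \frac{f_\text{line}}{525} = \frac{30}{1.001}\ \text{Hz} \approx 29.9700\ \text{Hz},\\[4pt]
f_\text{sc} &= 227.5 \times f_\text{line} = \frac{455}{572} \times 4.5\ \text{MHz} \approx 3.579545\ \text{MHz}.
\end{align}
</math>

The last value, following directly from the 35/44 ratio given above, is the familiar NTSC color burst frequency of approximately 3.579545 MHz.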