==Part 1: Systems==
Part 1 of the MPEG-1 standard covers ''systems'', and is defined in ISO/IEC-11172-1.

MPEG-1 Systems specifies the logical layout and methods used to store the encoded audio, video, and other data into a standard bitstream, and to maintain synchronization between the different contents. This [[file format]] is specifically designed for storage on media, and transmission over [[communication channel]]s, that are considered relatively reliable. Only limited error protection is defined by the standard, and small errors in the bitstream may cause noticeable defects.

This structure was later named an [[MPEG program stream]]: "The MPEG-1 Systems design is essentially identical to the MPEG-2 Program Stream structure."<ref name=mpeg1_systems>{{Citation |first=Leonardo |last=Chiariglione |title=MPEG-1 Systems |publisher=[[International Organization for Standardization|ISO]]/[[International Electrotechnical Commission|IEC]] |url=http://mpeg.chiariglione.org/faq/mp1-sys/mp1-sys.htm |access-date=2016-11-11 |url-status=live |archive-url=https://web.archive.org/web/20161112204902/http://mpeg.chiariglione.org/faq/mp1-sys/mp1-sys.htm |archive-date=2016-11-12 }}</ref> This terminology is more popular and more precise (it differentiates the structure from an [[MPEG transport stream]]), and will be used here.

===Elementary streams, packets, and clock references===
*Elementary Streams (ES) are the raw bitstreams of MPEG-1 audio and video encoded data (output from an encoder). These files can be distributed on their own, as is the case with MP3 files.
*Packetized Elementary Streams (PES) are elementary streams [[Packet (information technology)|packet]]ized into packets of variable lengths, i.e., an ES divided into independent chunks, with a [[cyclic redundancy check]] (CRC) [[checksum]] added to each packet for error detection.
*System Clock Reference (SCR) is a timing value stored in a 33-bit header of each PES, at a frequency/precision of 90 kHz, with an extra 9-bit extension that stores additional timing data with a precision of 27 MHz.<ref name=pack_header>{{Citation |title=Pack Header |url=http://dvd.sourceforge.net/dvdinfo/packhdr.html |access-date=2016-11-11 |url-status=live |archive-url=https://web.archive.org/web/20161027171334/http://dvd.sourceforge.net/dvdinfo/packhdr.html |archive-date=2016-10-27 }}</ref><ref name=tutorial_stc>{{Citation |first1=Mark |last1=Fimoff |first2=Wayne E. |last2=Bretl |title=MPEG2 Tutorial |date=December 1, 1999 |url=http://www.bretl.com/mpeghtml/STC.HTM |access-date=2016-11-11 |url-status=live |archive-url=https://web.archive.org/web/20161112144128/http://www.bretl.com/mpeghtml/STC.HTM |archive-date=November 12, 2016 }}</ref> These are inserted by the encoder, derived from the system time clock (STC). Simultaneously encoded audio and video streams will not have identical SCR values, however, due to buffering, encoding, jitter, and other delays. (A sketch of how the two SCR fields combine appears after this list.)
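The following is a minimal sketch, in C, of how the two SCR fields described above might be combined into a single 27 MHz count: since 27 MHz / 90 kHz = 300, the 33-bit base is scaled by 300 and the 9-bit extension (valid range 0–299) is added. The function name and the sample values are hypothetical, chosen only for illustration.

<syntaxhighlight lang="c">
#include <stdint.h>
#include <stdio.h>

/*
 * Combine the 33-bit SCR base (counting 90 kHz ticks) with the 9-bit SCR
 * extension (counting 27 MHz ticks between 90 kHz ticks, 0..299) into a
 * single 27 MHz clock value. 27,000,000 / 90,000 = 300.
 */
static uint64_t scr_to_27mhz(uint64_t scr_base_90khz, uint32_t scr_ext)
{
    return scr_base_90khz * 300u + scr_ext;
}

int main(void)
{
    /* Hypothetical values for illustration only. */
    uint64_t base = 450000;   /* 450000 ticks / 90000 Hz = 5 seconds      */
    uint32_t ext  = 150;      /* half-way to the next 90 kHz tick          */

    uint64_t ticks27 = scr_to_27mhz(base, ext);
    printf("SCR = %llu ticks of 27 MHz = %.9f s\n",
           (unsigned long long)ticks27, ticks27 / 27000000.0);
    return 0;
}
</syntaxhighlight>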
===Program streams===
{{further|MPEG program stream}}
[[MPEG program stream|Program Streams]] (PS) are concerned with combining multiple packetized elementary streams (usually just one audio PES and one video PES) into a single stream, ensuring simultaneous delivery, and maintaining synchronization. The PS structure is known as a [[multiplexing|multiplex]], or a [[container format (digital)|container format]].

Presentation time stamps (PTS) exist in PS to correct the inevitable disparity between audio and video SCR values (time-base correction). 90 kHz PTS values in the PS header tell the decoder which video SCR values match which audio SCR values.<ref name=pack_header/> PTS determines when to display a portion of an MPEG program, and is also used by the decoder to determine when data can be discarded from the [[data buffer|buffer]].<ref name=tutorial_pts>{{Citation |first1=Mark |last1=Fimoff |first2=Wayne E. |last2=Bretl |title=MPEG2 Tutorial |date=December 1, 1999 |url=http://www.bretl.com/mpeghtml/PTS.HTM |access-date=2016-11-11 |url-status=live |archive-url=https://web.archive.org/web/20161105163559/http://www.bretl.com/mpeghtml/PTS.HTM |archive-date=November 5, 2016 }}</ref> Either video or audio will be delayed by the decoder until the corresponding segment of the other arrives and can be decoded.

PTS handling can be problematic. Decoders must accept multiple ''program streams'' that have been concatenated (joined sequentially). This causes PTS values in the middle of the video to reset to zero and then begin incrementing again. Such PTS wraparound disparities can cause timing issues that must be specially handled by the decoder.

Decoding Time Stamps (DTS), additionally, are required because of B-frames. With B-frames in the video stream, adjacent frames have to be encoded and decoded out of order (re-ordered frames). DTS is quite similar to PTS, but instead of just handling sequential frames, it contains the proper time-stamps to tell the decoder when to decode and display the next B-frame (types of frames explained below), ahead of its anchor (P- or I-) frame. Without B-frames in the video, PTS and DTS values are identical.<ref name=tutorial_dts>{{Citation |first1=Mark |last1=Fimoff |first2=Wayne E. |last2=Bretl |title=MPEG2 Tutorial |date=December 1, 1999 |url=http://www.bretl.com/mpeghtml/DTS.HTM |access-date=2016-11-11 |url-status=live |archive-url=https://web.archive.org/web/20161105163603/http://www.bretl.com/mpeghtml/DTS.HTM |archive-date=November 5, 2016 }}</ref> (A worked PTS/DTS ordering example appears after the Multiplexing section below.)

===Multiplexing===
To generate the PS, the multiplexer will interleave the (two or more) packetized elementary streams. This is done so the packets of the simultaneous streams can be transferred over the same [[Communication channel|channel]] and are guaranteed to arrive at the decoder at precisely the same time. This is a case of [[time-division multiplexing]].

Determining how much data from each stream should be in each interleaved segment (the size of the interleave) is complicated, yet an important requirement. Improper interleaving will result in buffer underflows or overflows, as the receiver gets more of one stream than it can store (e.g. audio) before it gets enough data to decode the other simultaneous stream (e.g. video). The MPEG [[Video Buffering Verifier]] (VBV) assists in determining if a multiplexed PS can be decoded by a device with a specified data throughput rate and buffer size.<ref name=tutorial_vbv>{{Citation |first1=Mark |last1=Fimoff |first2=Wayne E. |last2=Bretl |title=MPEG2 Tutorial |date=December 1, 1999 |url=http://www.bretl.com/mpeghtml/VBV.HTM |access-date=2016-11-11 |url-status=live |archive-url=https://web.archive.org/web/20161112144717/http://www.bretl.com/mpeghtml/VBV.HTM |archive-date=November 12, 2016 }}</ref> This offers feedback to the multiplexer and the encoder, so that they can change the multiplex size or adjust bitrates as needed for compliance.
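As a rough illustration of the underflow/overflow check just described (this is not the normative VBV algorithm from the standard; the bitrate, buffer size, frame rate, and coded picture sizes below are hypothetical), a decoder buffer can be simulated by removing one coded picture per frame period while the channel refills the buffer at a constant rate:

<syntaxhighlight lang="c">
#include <stdio.h>

/*
 * Illustrative buffer-fullness simulation: check whether a sequence of
 * coded picture sizes can be decoded with a given channel bitrate and
 * buffer size without underflow or overflow. All values are hypothetical.
 */
int main(void)
{
    const double bitrate    = 1150000.0;   /* bits per second fed into the buffer   */
    const double buffer_sz  = 327680.0;    /* decoder buffer size in bits            */
    const double frame_rate = 25.0;        /* one picture removed per frame period   */
    const double frame_bits[] = { 120000, 30000, 25000, 90000, 28000, 26000 };
    const int n = sizeof(frame_bits) / sizeof(frame_bits[0]);

    double fullness = buffer_sz * 0.9;     /* assume the buffer starts nearly full   */

    for (int i = 0; i < n; i++) {
        fullness -= frame_bits[i];               /* picture removed at decode time   */
        if (fullness < 0.0)
            printf("frame %d: underflow by %.0f bits\n", i, -fullness);

        fullness += bitrate / frame_rate;        /* channel refills the buffer       */
        if (fullness > buffer_sz) {
            printf("frame %d: overflow by %.0f bits\n", i, fullness - buffer_sz);
            fullness = buffer_sz;
        }
    }
    printf("final fullness: %.0f of %.0f bits\n", fullness, buffer_sz);
    return 0;
}
</syntaxhighlight>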
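Returning to the timestamp discussion under ''Program streams'', the sketch below illustrates the PTS/DTS relationship for a short sequence containing B-frames: DTS values follow coded (decode) order, PTS values follow display order, and the two coincide only for B-frames, while anchor (I- and P-) frames are displayed later than they are decoded. The frame pattern, 25 fps frame rate, and one-frame reordering delay are assumptions made only for illustration, not values taken from a real stream.

<syntaxhighlight lang="c">
#include <stdio.h>

/*
 * A short GOP listed in coded (decode) order; each entry carries its
 * position in display order. DTS follows coded order, PTS follows display
 * order, and both are expressed in 90 kHz ticks. Hypothetical sequence.
 */
struct frame { char type; int display_index; };

int main(void)
{
    const long T = 3600;                 /* one frame period at 25 fps: 90000/25    */
    const struct frame coded_order[] = { /* display order: I0 B1 B2 P3 B4 B5 P6     */
        {'I', 0}, {'P', 3}, {'B', 1}, {'B', 2}, {'P', 6}, {'B', 4}, {'B', 5}
    };
    const int n = sizeof(coded_order) / sizeof(coded_order[0]);
    const long reorder_delay = T;        /* decoder holds one picture for reordering */

    for (int i = 0; i < n; i++) {
        long dts = (long)i * T;                                      /* decode time       */
        long pts = coded_order[i].display_index * T + reorder_delay; /* presentation time */
        printf("%c  DTS=%6ld  PTS=%6ld  %s\n", coded_order[i].type, dts, pts,
               coded_order[i].type == 'B' ? "(PTS == DTS)" : "(displayed later)");
    }
    return 0;
}
</syntaxhighlight>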