==Media codecs==
{{Further|Video codec|Audio codec}}

Two principal techniques are used in codecs: [[pulse-code modulation]] and [[delta modulation]]. Codecs are often designed to emphasize certain aspects of the media to be encoded. For example, a digital video (using a [[DV (video format)|DV]] codec) of a sports event needs to encode motion well but not necessarily exact colors, while a video of an art exhibit needs to encode color and surface texture well. Audio codecs for cell phones need to have very low [[Latency (audio)|latency]] between source encoding and playback. In contrast, audio codecs for recording or broadcasting can use high-latency [[audio compression (data)|audio compression]] techniques to achieve higher fidelity at a lower bit rate.

There are thousands of audio and video codecs, ranging in cost from free to hundreds of dollars or more. This variety of codecs can create compatibility and obsolescence issues. The impact is lessened for older formats, for which free or nearly free codecs have existed for a long time. However, the older formats are often ill-suited to modern applications, such as playback on small portable devices. For example, raw uncompressed [[PCM audio]] (44.1 kHz, 16-bit stereo, as represented on an audio CD or in a .wav or .aiff file) has long been a standard across multiple platforms, but its transmission over networks is slow and expensive compared with more modern compressed formats, such as [[Opus (audio format)|Opus]] and MP3.

Many [[multimedia]] data streams contain both [[Sound|audio]] and [[video]], and often some metadata that permits synchronization of audio and video. Each of these three streams may be handled by different programs, processes, or hardware, but for the multimedia data streams to be useful in stored or transmitted form, they must be encapsulated together in a [[container format]].

Lower-[[bitrate]] codecs allow more simultaneous users within a given amount of bandwidth, but they also introduce more distortion. Beyond this initial increase in distortion, lower-bitrate codecs achieve their lower bit rates by using more complex algorithms that make certain assumptions, such as those about the media and the packet loss rate. Other codecs may not make those same assumptions. When a user with a low-bitrate codec talks to a user with another codec, additional distortion is introduced by each [[transcoding]].

[[Audio Video Interleave]] (AVI) is sometimes erroneously described as a codec, but AVI is actually a container format, while a codec is a software or hardware tool that encodes or decodes audio or video into or from some audio or video format. Audio and video encoded with many different codecs can be put into an AVI container, although AVI is not an [[ISO standard]]. There are also other well-known container formats, such as [[Ogg]], [[Advanced Systems Format|ASF]], [[QuickTime]], [[RealMedia]], [[Matroska]], and [[DivX Media Format]]. [[MPEG transport stream]], [[MPEG program stream]], [[MP4]], and [[ISO base media file format]] are examples of container formats that are ISO standardized.
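As a rough illustration of the bandwidth difference mentioned above (compressed bit rates vary with the codec, encoder, and settings chosen), the bit rate of uncompressed CD-quality PCM follows directly from its parameters:

:<math>44{,}100\ \tfrac{\text{samples}}{\text{s}} \times 16\ \tfrac{\text{bits}}{\text{sample}} \times 2\ \text{channels} = 1{,}411{,}200\ \text{bit/s} \approx 1.4\ \text{Mbit/s},</math>

which is roughly ten times the bit rate of a typical MP3 or Opus stream.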