===Digital colorization===
Computerized colorization began in the 1970s, using a technique invented by Wilson Markle. These early attempts at colorization have soft contrast and fairly pale, flat, washed-out color; however, the technology has improved steadily since the 1980s.

To perform digital colorization, a digitized copy of the best [[monochrome|black and white]] film print available is used. With the aid of computer software, technicians associate a range of gray levels with each object and indicate to the computer any movement of the objects within a shot. The software can also detect variations in the light level from frame to frame and correct them if necessary. The technician selects a color for each object based on common "memory" colors—such as blue sky, white clouds, flesh tones, and green grass—and on any available information about the colors used in the movie. If color publicity stills or props are available to examine, authentic colors may be applied. In the absence of better information, technicians may choose colors that fit the gray level and are consistent with what a director might have wanted for the scene. The software associates a variation of the basic color with each gray level in the object, while keeping intensity levels the same as in the monochrome original. It then follows each object from frame to frame, applying the same color until the object leaves the frame. As new objects enter the frame, the technician must assign colors to each new object in the same way.<ref>{{cite web |url=http://www.museum.tv/eotvsection.php?entrycode=colorization |title=COLORIZATION |access-date=2007-01-01 |archive-date=2013-05-07 |archive-url=https://web.archive.org/web/20130507140446/http://www.museum.tv/eotvsection.php?entrycode=colorization |url-status=dead }}</ref> This technique was patented in 1991.<ref>{{cite web |url=http://brevets-patents.ic.gc.ca/opic-cipo/cpd/eng/patent/1291260/summary.html |title=Canadian Intellectual Property Office |access-date=2007-01-01 |archive-date=2011-10-08 |archive-url=https://web.archive.org/web/20111008052408/http://brevets-patents.ic.gc.ca/opic-cipo/cpd/eng/patent/1291260/summary.html |url-status=live }}</ref>

To colorize a still image, an artist typically begins by dividing the image into regions and then assigning a color to each region. This approach, also known as the [[segmentation (image processing)|segmentation]] method, is laborious and time-consuming, especially in the absence of fully automatic [[algorithm]]s to identify fuzzy or complex region boundaries, such as those between a subject's hair and face. Colorization of moving images also requires [[motion compensation]]: tracking regions as movement occurs from one frame to the next.

Several companies and research groups claim to have produced automatic region-tracking algorithms:
* Legend Films describes its core technology as pattern recognition and background compositing that moves and morphs foreground and background masks from frame to frame. In this process, backgrounds are colorized separately in a single composite frame that functions as a visual database of a cut and includes all offset data on each camera movement. Once the foreground areas are colorized, background masks are applied frame by frame.
* Timebrush describes a process based on [[Artificial neural network|neural net]] technology that produces saturated, crisp colors with clear lines and no apparent spill-over. The process is cost-effective because it relies on computers rather than human effort, and is equally suitable for low-budget colorization and for broadcast-quality or theatrical projection.
* A team at the [[Hebrew University of Jerusalem]]'s Benin School of Computer Science and Engineering describes its method as an interactive process that requires neither precise manual region detection nor accurate tracking; it is based on the premise that adjacent pixels in space and time that have similar gray levels should also have similar colors.
* At the [[University of Minnesota]], a color propagation method was developed that uses [[Distance (graph theory)|geodesic distance]].<ref>{{cite web |url=http://moon.felk.cvut.cz/~sykorad/literature.html |archive-url=https://web.archive.org/web/20101106161817/http://moon.felk.cvut.cz/~sykorad/literature.html |archive-date=2010-11-06 |title=Annotation of colorization methods |access-date=2007-01-01 |author=Daniel Sýkora }}</ref>
* A highly labor-intensive process by the UK-based film and video colorization artist [[Stuart Humphryes]], in conjunction with video restoration company SVS Resources, was employed by the [[BBC]] in 2013 for the commercial release of two ''[[Doctor Who]]'' serials: the first episode of ''[[The Mind of Evil]]'' and newly discovered black and white footage in the director's cut of ''[[Terror of the Zygons]]''. For these projects, approximately 7,000 key frames (roughly every fifth [[PAL]] video frame) were fully colorized by hand, without the use of masks, layers, or the [[segmentation (image processing)|segmentation]] method. These were then used by SVS Resources to interpolate the color across the intervening frames using a part-computerized, part-manual process.<ref>{{cite web |url=http://babelcolour.com/dvd-work/mind-of-evil/ |title=Babelcolour Video Colourisation |access-date=2013-11-15 |date=8 May 2013 |archive-date=2016-04-15 |archive-url=https://web.archive.org/web/20160415184248/http://babelcolour.com/dvd-work/mind-of-evil/ |url-status=live }}</ref>
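The two ideas that recur above—propagating a chosen color to nearby pixels with similar gray levels, and keeping the intensity of the monochrome original—can be illustrated with a toy sketch. This is not any company's actual algorithm: the flood-fill propagation, the hue values, and the gray-level tolerance are all illustrative assumptions.

```python
import colorsys
from collections import deque

def propagate_colors(gray, scribbles, tol=0.1):
    """Flood hues outward from hand-placed color "scribbles": a
    neighboring pixel inherits a hue only if its gray level is within
    `tol` of the pixel it spreads from (similar gray -> similar color)."""
    h, w = len(gray), len(gray[0])
    hue = [[None] * w for _ in range(h)]
    queue = deque()
    for (y, x), hv in scribbles.items():
        hue[y][x] = hv
        queue.append((y, x))
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and hue[ny][nx] is None
                    and abs(gray[ny][nx] - gray[y][x]) <= tol):
                hue[ny][nx] = hue[y][x]
                queue.append((ny, nx))
    # Keep the gray level as the HSV value channel, so the intensity of
    # the monochrome original is preserved; unreached pixels stay gray.
    return [[colorsys.hsv_to_rgb(hue[y][x] if hue[y][x] is not None else 0.0,
                                 0.5 if hue[y][x] is not None else 0.0,
                                 gray[y][x])
             for x in range(w)] for y in range(h)]

# A tiny 2x3 "frame": a bright sky row above a dark grass row.
gray = [[0.9, 0.9, 0.9],
        [0.2, 0.2, 0.2]]
# Hypothetical scribbles: a blue-ish hue on one sky pixel, green on one grass pixel.
img = propagate_colors(gray, {(0, 0): 0.6, (1, 0): 0.33})
```

The sharp gray-level jump between the rows stops the blue hue from leaking into the grass, while each pixel's original gray level survives as the brightness of its new color.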