== History ==
{{unreferenced section|date=April 2025}}

When [[videotape]]s were first developed in the 1950s, the only way to edit was to physically cut the tape with a razor blade and splice segments together. While the footage excised in this process was not technically destroyed, continuity was lost and the footage was generally discarded. In 1963, with the introduction of the [[Ampex]] Editec, videotape could be edited electronically with a process known as [[linear video editing]] by selectively copying the original footage to another tape called a ''master''. The original recordings are not destroyed or altered in this process. However, since the final product is a copy of the original, there is a generation loss of quality.

=== First non-linear editor ===
The first truly non-linear editor, the [[CMX 600]], was introduced in 1971 by [[CMX Systems]], a joint venture between [[CBS]] and [[Memorex]].<ref>{{citation |archive-url=https://web.archive.org/web/20130410092334/http://sundialmedia.com/sait/articles/found_a/heat_f.htm |archive-date=2013-04-10 |url=http://sundialmedia.com/sait/articles/found_a/heat_f.htm |title=The History of Digital Nonlinear Editing |work=Facer Ezine}}</ref><ref>{{citation |archive-url=https://web.archive.org/web/20071021231834/http://nonlinear.info/N4.history.pdf |archive-date=2007-10-21 |url=http://nonlinear.info/N4.history.pdf |title=A Brief History Of Electronic Editing |work=Non Linear}}</ref> It recorded and played back black-and-white analog video in "[[skip-field]]" mode on modified [[disk pack]] drives the size of washing machines, which could store a half-hour's worth of video and audio for editing. Such disk packs were commonly used to store data digitally on mainframe computers of the time. The 600 had a console with two monitors built in. The right monitor, which played the preview video, was used by the editor to make cuts and edit decisions using a [[light pen]], selecting from options superimposed as text over the preview video. The left monitor displayed the edited video. A DEC [[PDP-11]] computer served as controller for the whole system. Because the video edited on the 600 was low-resolution black and white, the system was suitable only for offline editing.

=== The 1980s ===
Non-linear editing systems were built in the 1980s using computers coordinating multiple [[LaserDisc]]s or banks of VCRs. One example of these tape- and disc-based systems was Lucasfilm's [[EditDroid]], which used several LaserDiscs of the same raw footage to simulate random-access editing.{{efn|A compatible system called [[SoundDroid]] was developed for sound post-production. This is considered to be one of the earliest [[digital audio workstation]]s.<ref name="Rubin" />}} EditDroid was demonstrated at NAB in 1984.<ref name="fraser-harrison">{{cite web |title=What was EditDroid? |url=https://fraser-harrison-postproduction.blogspot.com/2013/03/what-was-editdroid.html |author=Fraser Harrison |date=2013-03-14 |access-date=2019-08-29}}<!--ref indicates 1994 NAB #62. this appears to be a typo. first [[NAB Show]] was 1923 so #62 was in 1984.--></ref> EditDroid was the first system to introduce modern concepts in non-linear editing such as timeline editing and clip bins. The LA-based post house Laser Edit{{efn|Laser Edit later merged with Pacific Video as Laser-Pacific.}} also had an in-house system using recordable random-access LaserDiscs.
The most popular non-linear system in the 1980s was [[Ediflex]],<ref>{{cite web |url=http://www.articles.adsoft.org/postproduction.htm |archive-url=https://web.archive.org/web/20120302111642/http://www.articles.adsoft.org/postproduction.htm |archive-date=2012-03-02 |author=Richard Seel |title=Developments in Post Production 1946 - 1991}}</ref> which used a bank of [[U-matic]] and [[VHS]] VCRs for offline editing. Ediflex was introduced in 1983 on the Universal series ''[[Still the Beaver]]''. By 1985 it was used on over 80% of filmed network programs, and Cinedco was awarded the [[Technical Emmy]] for "Design and Implementation of Non-Linear Editing for Filmed Programs".<ref>{{cite book|first1=John|last1=Buck|title=Timeline, A History of Editing |publisher=Enriched Books |location=Melbourne|year=1988 |pages=448 |isbn=978-0-646-49224-7}}</ref><ref>{{cite news |title=NBC leads Emmy winners with 15 honors in behind-scenes categories |newspaper=Associated Press |location=Pasadena, CA |url=https://apnews.com/b0bef37b4cc86b35ea770cb2443d3dc4 |date=September 8, 1986 |access-date=July 30, 2013 |archive-url=https://web.archive.org/web/20150928230637/http://www.apnewsarchive.com/1986/NBC-Leads-Emmy-Winners-With-15-Honors-In-Behind-Scenes-Categories/id-b0bef37b4cc86b35ea770cb2443d3dc4 |archive-date=September 28, 2015 |url-status=live}}</ref>

In 1984, the [[Montage Picture Processor]] was demonstrated at NAB.<ref name="fraser-harrison"/> Montage used 17 identical copies of a set of film rushes on modified consumer Betamax VCRs. A custom circuit board added to each deck enabled frame-accurate switching and playback using [[vertical interval timecode]]. Intelligent positioning and sequencing of the source decks simulated random-access playback of a lengthy edited sequence without any re-recording: with so many copies of the rushes, there could always be one machine cued up to replay the next shot in real time. The [[Edit decision list|EDL]] could be changed easily, and the results seen immediately. The first feature edited on the Montage was Sidney Lumet's ''[[Power (1986 film)|Power]]''. Notably, Francis Coppola edited ''[[The Godfather Part III]]'' on the system, and Stanley Kubrick used it for ''[[Full Metal Jacket]]''. It was used on several episodic TV shows (''[[Knots Landing]]'', for one) and on hundreds of commercials and music videos. The original Montage system won an Academy Award for Technical Achievement in 1988.{{citation needed|reason=Doesn't appear in list at [[Academy Scientific and Technical Award]]|date=September 2019}} Montage was reincarnated as Montage II in 1987, and Montage III appeared at NAB in 1991, using digital disk technology, which was considerably less cumbersome than the Betamax system.

All of these original systems were slow, cumbersome, and hampered by the limited computing power of the time, but the mid-to-late 1980s saw a trend towards non-linear editing, moving away from film editing on [[Moviola]]s and the linear videotape method using U-matic VCRs. Computer processing advanced sufficiently by the end of the 1980s to enable true digital imagery, and has since progressed to provide this capability on personal desktop computers. An early example was the first all-digital non-linear editing system, the "Harry" effects compositing system manufactured by [[Quantel]] in 1985.
Although it was more of a video effects system, it had some non-linear editing capabilities. Most importantly, it could record, and apply effects to, 80 seconds of broadcast-quality uncompressed digital video (the limit imposed by its hard disk space), encoded in 8-bit [[CCIR 601]] format on its built-in hard disk array.

=== The 1990s ===
The term ''nonlinear editing'' was formalized in 1991 with the publication of [[Michael Rubin (author)|Michael Rubin's]] ''Nonlinear: A Guide to Digital Film and Video Editing'',<ref name="Rubin">{{cite book |title=Nonlinear: A Guide to Digital Film and Video Editing |first1=Michael |last1=Rubin |date=1991 |publisher=Triad Pub. Co. |isbn=0937404853}}</ref> which popularized this term over others common at the time, including ''real-time'' editing, ''random-access'' or ''RA'' editing, ''virtual'' editing, and ''electronic film'' editing.{{Citation needed|date=December 2016}}

Non-linear editing with computers as it is known today was first introduced by [[Editing Machines Corp.]] in 1989 with the EMC2 editor, a PC-based non-linear off-line editing system that used magneto-optical disks for storage and playback of half-screen-resolution video at 15 frames per second. A couple of weeks later that same year, [[Avid Technology|Avid]] introduced the Avid/1, the first in its line of [[Media Composer]] systems. It was based on the [[Apple Macintosh]] platform ([[Macintosh II]] systems were used) with special hardware and software developed and installed by Avid. The video quality of the Avid/1 (and of later Media Composer systems from the late 1980s) was somewhat low (about VHS quality), due to the use of a very early version of a [[Motion JPEG]] (M-JPEG) codec. It was sufficient, however, to provide a versatile system for offline editing. ''[[Lost in Yonkers (film)|Lost in Yonkers]]'' (1993) was the first film edited with Avid Media Composer, and the first long-form documentary so edited was the HBO program ''Earth and the American Dream'', which won a National Primetime Emmy Award for Editing in 1993.

The NewTek [[Video Toaster Flyer]] for the [[Amiga]] included non-linear editing capabilities in addition to processing live video signals. The Flyer used [[hard drive]]s to store video clips and audio, and supported complex scripted playback. It provided simultaneous dual-channel playback, which let the Toaster's [[video switcher]] perform transitions and other effects on video clips without additional [[Rendering (computer graphics)|rendering]]. The Flyer portion of the Video Toaster/Flyer combination was a complete computer of its own, with its own [[microprocessor]] and [[embedded software]]. Its hardware included three embedded [[SCSI]] controllers; two of these SCSI buses were used to store video data, and the third to store audio. The Flyer used a proprietary [[wavelet compression]] algorithm known as VTASC, which was well regarded at the time for offering better visual quality than comparable non-linear editing systems using [[motion JPEG]].

Until 1993, the Avid Media Composer was most often used for editing commercials and other short-form, high-value projects. This was primarily because the purchase cost of the system was very high, especially in comparison to the offline tape-based systems then in general use.
Hard disk storage was also expensive enough to limit both the quality of footage that most editors could work with and the amount of material that could be held digitized at any one time.{{efn|In editing facilities rented by the hour or the day, a production's digitized rushes would usually be deleted at the end of the hire, so that the full amount of hard disk storage was available to the next client.}} Until 1992, Apple Macintosh computers could access only 50 [[gigabyte]]s of storage at once. This limitation was overcome by a digital video R&D team at the [[Disney Channel]] led by [[Rick Eye]]. By February 1993, this team had integrated a long-form system that let the Avid Media Composer, running on the Apple Macintosh, access over seven [[terabyte]]s of digital video data. With instant access to the shot footage of an entire [[movie]], long-form non-linear editing was now possible. The system made its debut at the [[National Association of Broadcasters|NAB]] conference in 1993 in the booths of the three primary sub-system manufacturers: Avid, [[Silicon Graphics]] and [[Sony]]. Within a year, thousands of these systems had replaced [[35mm movie film|35mm film]] editing equipment in major motion picture studios and TV stations worldwide.<ref>{{cite book|url=https://books.google.com/books?id=k9KkIsSb5x0C&q=media+100+editing+suite+history&pg=PA137 |title=Producing Video Podcasts: A Guide for Media Professionals |first1=Richard |last1=Harrington |first2=Mark |last2=Weiser |first3=RHED |last3=Pixel |date=12 February 2019 |publisher=Taylor & Francis |isbn=9780240810294 |via=Google Books}}</ref>

Although M-JPEG became the standard codec for NLE during the early 1990s, it had drawbacks. Its high computational requirements ruled out software implementations, imposing the extra cost and complexity of hardware compression/playback cards. More importantly, the traditional tape [[workflow]] had involved editing from videotape, often in a rented facility: when the editor left the edit suite, they could securely take their tapes with them. But the M-JPEG data rate was too high for systems like the Avid/1 on the Apple Macintosh and [[Lightworks]] on the PC to store the video on removable storage, so the content had to be stored on fixed hard disks instead. The secure tape paradigm of keeping one's content in hand was not possible with these fixed disks. Editing machines were often rented from facilities houses on a per-hour basis, and some productions chose to delete their material after each edit session and ingest it again the next day to guarantee the security of their content.{{Citation needed|date=November 2009}} In addition, each NLE system's storage was limited by its fixed disk capacity.

These issues were addressed by a small UK company, [[Eidos Interactive]]. Eidos chose the new [[ARM architecture|ARM]]-based computers from the UK and implemented an editing system, launched in Europe in 1990 at the [[International Broadcasting Convention]]. Because it implemented its own compression software designed specifically for non-linear editing, the Eidos system had no need for JPEG hardware and was cheap to produce. The software could decode multiple video and audio streams at once for real-time effects at no extra cost. Most significantly, for the first time, it supported unlimited cheap removable storage. The Eidos Edit 1, Edit 2, and later Optima systems let the editor use ''any'' Eidos system, rather than being tied to a particular one, and still keep their data secure.
The Optima software editing system was closely tied to [[Acorn Computers Ltd|Acorn]] hardware, so when Acorn stopped manufacturing the [[Risc PC]] in the late 1990s, Eidos discontinued the Optima system.{{citation needed|reason=Entire paragraph is unsourced and there is nothing helpful in [[Eidos Interactive]].|date=October 2020}}

In the early 1990s, a small American company called Data Translation took what it knew about coding and decoding pictures for the US military and large corporate clients and spent $12 million developing a desktop editor based on its proprietary compression algorithms and off-the-shelf parts. Its aim was to democratize the desktop and take some of Avid's market. In August 1993, [[Media 100]] entered the market, providing would-be editors with a low-cost, high-quality platform.{{citation needed|reason=Entire paragraph is unsourced and there are no helpful sources in [[Media 100]].|date=October 2020}}

Around the same period, other competitors provided non-linear systems that required special hardware, typically cards added to the computer system. Fast Video Machine was a PC-based system that came out first as an offline system and later gained [[online editing]] capability. The [[Imix video cube]], another contender for media production companies, had a control surface with faders for mixing and shuttle control. Data Translation's Media 100 came with three different JPEG codecs for different types of graphics and many resolutions. The [[DOS]]-based [[D/Vision Pro]] was released by TouchVision Systems, Inc. in the mid-1990s and worked with the [[Action Media II]] board. These competitors put tremendous downward market pressure on Avid, which was forced to continually offer lower-priced systems to compete with the Media 100 and other systems.

Inspired by the success of Media 100, members of the [[Adobe Premiere Pro|Premiere]] development team left Adobe to start a project called "Keygrip" for Macromedia. Difficulty raising support and money for development led the team to take their non-linear editor to the [[NAB Show]]. After various companies made offers, Keygrip was purchased by Apple, as Steve Jobs wanted a product to compete with Adobe Premiere in the desktop video market. At around the same time, Avid, now with Windows versions of its editing software, was considering abandoning the Macintosh platform. Apple released [[Final Cut Pro]] in 1999, and despite not being taken seriously at first by professionals, it has evolved into a serious competitor to entry-level Avid systems.

=== DV ===
Another leap came in the late 1990s with the launch of [[DV (video format)|DV-based]] video formats for consumer and professional use. With DV came [[IEEE 1394]] (FireWire/iLink), a simple and inexpensive way of getting video into and out of computers. Users no longer had to convert video from [[analog signal|analog]] to digital, since it was recorded as digital to start with, and FireWire offered a straightforward way to transfer video data without additional hardware. With this innovation, editing became a more realistic proposition for software running on standard computers. It enabled desktop editing, producing high-quality results at a fraction of the cost of earlier systems.

=== HD ===
In the early 2000s, the introduction of highly compressed HD formats such as [[HDV]] continued this trend, making it possible to edit HD material on a standard computer running a software-only editing system.

[[Avid Technology|Avid]] is an industry standard used for major feature films, television programs, and commercials.<ref name="Nonlinear-editors">{{cite magazine|date=September 1, 2011|title=Nonlinear editors|url=http://www.tvtechnology.com/multiformat/0112/nonlinear-editors/242232|archive-url=https://web.archive.org/web/20180104191436/http://www.tvtechnology.com/multiformat/0112/nonlinear-editors/242232|archive-date=2018-01-04|magazine=Broadcast Engineering}}</ref> Final Cut Pro received a [[Technology & Engineering Emmy Award#2001 Awards|Technology & Engineering Emmy Award]] in 2002.

Since 2000, many personal computers have included basic non-linear video editing software free of charge, such as Apple [[iMovie]] for the Macintosh platform, open-source programs like [[Kdenlive]], [[Cinelerra-GG Infinity]] and [[PiTiVi]] for the Linux platform, and [[Windows Movie Maker]] for the Windows platform. This has brought low-cost non-linear editing to consumers.

=== The cloud ===
Because video editing involves large volumes of data, the proximity of the stored footage to the NLE system editing it is governed partly by the capacity of the data connection between the two. The increasing availability of broadband internet, combined with the use of lower-resolution copies of the original material, makes it possible not only to review and edit material remotely but also to give far more people access to the same content at the same time. In 2004 the first [[cloud-based video editor]], known as [[Blackbird (software)|Blackbird]] and based on technology invented by [[Stephen Streater]], was demonstrated at [[International Broadcasting Convention|IBC]] and recognized by the [[Royal Television Society|RTS]] the following year. Since then a number of other cloud-based editors have become available, including systems from [[Avid Technology|Avid]], [[WeVideo]] and [[Grabyo]]. Despite their reliance on a network connection, the need to ingest material before editing can take place, and the use of lower-resolution ''video proxies'', their adoption has grown. Their popularity has been driven largely by the efficiencies of greater collaboration and by potential cost savings from using a shared platform, hiring rather than buying infrastructure, and using conventional IT equipment rather than hardware specifically designed for video editing.

=== 4K ===
{{As of|2014}}, [[4K resolution|4K video]] in NLE was fairly new but was already being used in the creation of many movies throughout the world, owing to the increased use of advanced 4K cameras such as the [[Red Camera]]. Examples of software for this task include [[Avid Technology|Avid]] [[Media Composer]], Apple's [[Final Cut Pro X]], [[Sony Vegas]], [[Adobe Premiere]], [[DaVinci Resolve]], [[Edius]], and [[Cinelerra-GG Infinity]] for Linux.<ref>{{Cite web |last=Radev |first=Vlady |date=2014-07-11 |title=Popular Non-Linear Editors in 2014 Which Support 4K |url=https://www.4kshooters.net/2014/07/11/popular-non-linear-editors-in-2014-which-support-4k/ |access-date=2023-05-26 |website=www.4kshooters.net |language=en-US}}</ref>

=== 8K ===
{{As of|2019}}, [[8K video]] was relatively new.
8K video editing requires advanced hardware and software capable of handling the standard.{{citation needed|date=August 2021}}

=== Image editing ===
In imaging software, early works such as [[Kai Krause|HSC Software]]'s Live Picture<ref>{{Cite web |url=http://www.pixiq.com/article/live-picture |title=Live Picture |website=Pixiq |archive-url=https://web.archive.org/web/20130202102219/http://www.pixiq.com/article/live-picture |archive-date=2013-02-02 |url-status=dead }}</ref> brought non-destructive editing to the professional market, and current efforts such as [[GEGL]] provide an implementation used in open-source image editing software.