==Applications==
{{more citations needed section|date=February 2013}}
There are many applications of motion capture. The most common are video games, movies, and movement capture; the technology is also used in robotics research, for example at Purdue University.

=== Video games ===
[[Video games]] often use motion capture to animate athletes, [[martial artists]], and other in-game characters.<ref>Jon Radoff, Anatomy of an MMORPG, {{cite web |url=http://radoff.com/blog/2008/08/22/anatomy-of-an-mmorpg/ |title=Anatomy of an MMORPG |access-date=2009-11-30 |url-status=dead |archive-url=https://web.archive.org/web/20091213053756/http://radoff.com/blog/2008/08/22/anatomy-of-an-mmorpg/ |archive-date=2009-12-13 }}</ref><ref name="GPro82">{{cite magazine|title=Hooray for Hollywood! Acclaim Studios|magazine=[[GamePro]]|issue=82|publisher=[[International Data Group|IDG]]|date=July 1995|pages=28–29}}</ref> As early as 1988, an early form of motion capture was used to animate the [[2D computer graphics|2D]] [[player characters]] of [[Martech]]'s video game ''[[Vixen (video game)|Vixen]]'' (performed by model [[Corinne Russell]])<ref>{{cite magazine|magazine=[[Retro Gamer]]|title=Martech Games - The Personality People|page=51|issue=133|first=Graeme|last=Mason|url=https://issuu.com/michelfranca/docs/retro_gamer____133}}</ref> and [[Magical Company]]'s 2D arcade [[fighting game]] ''Last Apostle Puppet Show'' (to animate digitized [[Sprite (computer graphics)|sprites]]).<ref>{{cite web |title=Pre-Street Fighter II Fighting Games |url=http://www.hardcoregaming101.net/fighters/fighters8.htm |website=Hardcore Gaming 101 |page=8 |access-date=26 November 2021}}</ref> Motion capture was later notably used to animate the [[3D computer graphics|3D]] character models in the [[Sega Model 1|Sega Model]] [[arcade games]] ''[[Virtua Fighter (video game)|Virtua Fighter]]'' (1993)<ref name="CVG158">{{cite magazine |url=https://retrocdn.net/images/8/84/CVG_UK_158.pdf#page=12 |title=Sega Saturn exclusive! Virtua Fighter: fighting in the third dimension |magazine=[[Computer and Video Games]] |publisher=[[Future plc]] |issue=158 (January 1995) |date=15 December 1994 |pages=12–3, 15–6, 19}}</ref><ref name="Maximum">{{cite journal|title=Virtua Fighter|journal=Maximum: The Video Game Magazine|issue=1|publisher=[[Emap International Limited]]|date=October 1995|pages=142–3}}</ref> and ''[[Virtua Fighter 2]]'' (1994).<ref>{{cite web|last=Wawro|first=Alex|title=Yu Suzuki Recalls Using Military Tech to Make Virtua Fighter 2 |url=https://www.gamedeveloper.com/business/yu-suzuki-recalls-using-military-tech-to-make-i-virtua-fighter-2-i-|website=[[Gamasutra]]|access-date=18 August 2016|date=October 23, 2014}}</ref> In mid-1995, developer/publisher [[Acclaim Entertainment]] built its own in-house motion capture studio at its headquarters.<ref name="GPro82" /> [[Namco]]'s 1995 arcade game ''[[Soul Edge]]'' used passive optical system markers for motion capture.<ref>{{cite web |url=http://www.motioncapturesociety.com/resources/industry-history |title=History of Motion Capture |publisher=Motioncapturesociety.com |access-date=2013-08-10 |archive-url=https://web.archive.org/web/20181023162411/http://www.motioncapturesociety.com/resources/industry-history |archive-date=2018-10-23 |url-status=dead }}</ref> Motion capture of athletes has also been used to animate characters in games such as [[Naughty Dog]]'s ''[[Crash Bandicoot (video game)|Crash Bandicoot]]'', [[Insomniac Games]]' ''[[Spyro the Dragon]]'', and [[Rare (company)|Rare]]'s ''[[Star Fox Adventures#Development|Dinosaur Planet]]''.

=== Robotics ===
Indoor positioning is another application for optical motion capture systems. Robotics researchers often use motion capture systems when developing and evaluating control, estimation, and perception algorithms and hardware. In outdoor spaces, it is possible to achieve centimeter-level accuracy by using the Global Navigation Satellite System ([[Satellite navigation|GNSS]]) together with Real-Time Kinematics ([[Real-time kinematic positioning|RTK]]). However, this accuracy degrades significantly when there is no line of sight to the satellites, such as in indoor environments. The majority of vendors selling commercial optical motion capture systems provide accessible open-source drivers that integrate with the popular Robot Operating System ([[Robot Operating System|ROS]]) framework, allowing researchers and developers to effectively test their robots during development.

In the field of aerial robotics research, motion capture systems are also widely used for positioning. Regulations on airspace usage limit the feasibility of outdoor experiments with Unmanned Aerial Systems ([[Unmanned aerial vehicle#Terminology|UAS]]). Indoor tests can circumvent such restrictions, and many labs and institutions around the world have built indoor motion capture volumes for this purpose. Purdue University houses the world's largest indoor motion capture system, inside the Purdue UAS Research and Test (PURT) facility. PURT is dedicated to UAS research, and provides a tracking volume of 600,000 cubic feet using 60 motion capture cameras.<ref>{{Cite web |title=Purdue's Home for Drone Systems Engineering |url=https://engineering.purdue.edu/AAE/Aerogram/2021-fall/articles/ac-very-unique-facility |access-date=2023-09-18 |website=Aerogram Magazine - 2021-2022 |language=en}}</ref> The optical motion capture system is able to track targets in its volume with millimeter accuracy, effectively providing the true position of targets, which serves as the "ground truth" baseline in research and development. Results derived from other sensors and algorithms can then be compared to the ground truth data to evaluate their performance.
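To make the ground-truth comparison concrete, the following minimal Python sketch pairs poses streamed by a motion capture ROS driver with a robot's own pose estimate and reports the position error. The topic names are illustrative assumptions; real vendor drivers publish under their own topic names and message types.

<syntaxhighlight lang="python">
#!/usr/bin/env python
# Minimal sketch: compare a robot's estimated pose against motion capture
# "ground truth" in ROS 1. Topic names below are assumptions for illustration.
import math

import rospy
import message_filters
from geometry_msgs.msg import PoseStamped


def compare(mocap_msg, estimate_msg):
    """Log the position error between ground truth and the estimate."""
    gt = mocap_msg.pose.position
    est = estimate_msg.pose.position
    error = math.sqrt((gt.x - est.x) ** 2 +
                      (gt.y - est.y) ** 2 +
                      (gt.z - est.z) ** 2)
    rospy.loginfo("position error: %.4f m", error)


if __name__ == "__main__":
    rospy.init_node("ground_truth_eval")
    # Pose streamed by the vendor's mocap driver (hypothetical topic name).
    mocap_sub = message_filters.Subscriber("/mocap/robot/pose", PoseStamped)
    # Pose produced by the robot's own estimator (hypothetical topic name).
    est_sub = message_filters.Subscriber("/robot/pose_estimate", PoseStamped)
    # Pair up messages whose timestamps are within 10 ms of each other.
    sync = message_filters.ApproximateTimeSynchronizer(
        [mocap_sub, est_sub], queue_size=10, slop=0.01)
    sync.registerCallback(compare)
    rospy.spin()
</syntaxhighlight>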
=== Movies ===
Movies use motion capture for CGI effects, in some cases replacing traditional cel animation, and for completely [[computer-generated imagery|CGI]] creatures, such as [[Gollum]], [[The Mummy (1999 film)|The Mummy]], [[Peter Jackson's King Kong|King Kong]], [[Davy Jones (Pirates of the Caribbean)|Davy Jones]] from ''[[Pirates of the Caribbean (film series)|Pirates of the Caribbean]]'', the [[Pandoran biosphere#Na'vi|Na'vi]] from the film ''[[Avatar (2009 film)|Avatar]]'', and Clu from ''[[Tron: Legacy]]''. The Great Goblin, the three [[Troll (Middle-earth)#Troll types|Stone-trolls]], many of the orcs and goblins in the 2012 film ''[[The Hobbit: An Unexpected Journey]]'', and [[Smaug]] were created using motion capture.

The film ''[[Batman Forever]]'' (1995) used some motion capture for certain visual effects. [[Warner Bros.]] had acquired motion capture technology from [[arcade video game]] company Acclaim Entertainment for use in the film's production.<ref>{{cite magazine |title=Coin-Op News: Acclaim technology tapped for "Batman" movie |magazine=[[Play Meter]] |date=October 1994 |volume=20 |issue=11 |page=22 |url=https://archive.org/details/play-meter-volume-20-number-11-october-1994/Play%20Meter%20-%20Volume%2020%2C%20Number%2011%20-%20October%201994/page/22}}</ref> Acclaim's 1995 [[Batman Forever (video game)|video game of the same name]] also used the same motion capture technology to animate its digitized [[Sprite (computer graphics)|sprite]] graphics.<ref>{{cite magazine |title=Acclaim Stakes its Claim |magazine=RePlay |date=January 1995 |volume=20 |issue=4 |page=71 |url=https://archive.org/details/re-play-volume-20-issue-no.-4-january-1995/RePlay%20-%20Volume%2020%2C%20Issue%20No.%204%20-%20January%201995/page/n68}}</ref>

The 1999 film ''[[Star Wars: Episode I – The Phantom Menace]]'' was the first feature-length film to include a main character created using motion capture ([[Jar Jar Binks]], played by [[Ahmed Best]]). The 2000 [[India]]n-[[United States|American]] film ''[[Sinbad: Beyond the Veil of Mists]]'' was the first feature-length film made primarily with motion capture, although many character animators also worked on the film, which had a very limited release. 2001's ''[[Final Fantasy: The Spirits Within]]'' was the first widely released movie to be made with motion capture technology. Despite its poor box-office performance, supporters of motion capture technology took notice. ''[[Total Recall (1990 film)|Total Recall]]'' had already used the technique, in the scene with the x-ray scanner and the skeletons.

''[[The Lord of the Rings: The Two Towers]]'' was the first feature film to utilize a real-time motion capture system.
This method streamed the actions of actor [[Andy Serkis]] into the computer-generated skin of Gollum / Smeagol as the performance was taking place.<ref>{{cite magazine|last1=Savage|first1=Annaliza|title=Gollum Actor: How New Motion-Capture Tech Improved The Hobbit|url=https://www.wired.com/2012/12/andy-serkis-interview/|magazine=[[Wired (website)|Wired]]|access-date=29 January 2017|date=12 July 2012}}</ref>

Storymind Entertainment, an independent [[Ukrainians|Ukrainian]] studio, created the [[neo-noir]] [[Third-person shooter|third-person shooter]] video game ''[[My Eyes On You (video game)|My Eyes On You]]'', using motion capture to animate its main character, Jordan Adalien, along with non-playable characters.<ref>{{Cite web |title=INTERVIEW: Storymind Entertainment Talks About Upcoming 'My Eyes On You' |url=https://www.thatmomentin.com/my-eyes-on-you/ |access-date=2022-09-24 |website=That Moment In |date=29 October 2017 |language=en-US}}</ref>

Of the three nominees for the 2006 [[Academy Award for Best Animated Feature]], two (''[[Monster House (film)|Monster House]]'' and the winner ''[[Happy Feet]]'') used motion capture; only [[Walt Disney Pictures|Disney]]–[[Pixar]]'s ''[[Cars (film)|Cars]]'' was animated without it. In the ending credits of [[Pixar]]'s film ''[[Ratatouille (film)|Ratatouille]]'', a stamp appears labelling the film as "100% Genuine Animation – No Motion Capture!"

Since 2001, motion capture has been used extensively to simulate or approximate the look of live-action theater, with nearly [[Photorealism|photorealistic]] digital character models. ''[[The Polar Express (film)|The Polar Express]]'' used motion capture to allow [[Tom Hanks]] to perform as several distinct digital characters (for which he also provided the voices). The 2007 adaptation of ''[[Beowulf (2007 film)|Beowulf]]'' animated digital characters whose appearances were based in part on the actors who provided their motions and voices. James Cameron's highly popular ''[[Avatar (2009 film)|Avatar]]'' used this technique to create the Na'vi that inhabit Pandora. [[The Walt Disney Company]] produced [[Robert Zemeckis]]'s ''[[A Christmas Carol (2009 film)|A Christmas Carol]]'' using this technique. In 2007, Disney acquired Zemeckis's [[ImageMovers Digital]], which produced motion capture films, but closed it in 2011 after the [[box office failure]] of ''[[Mars Needs Moms]]''.

Television series produced entirely with motion capture animation include ''[[Et Dieu créa... Laflaque|Laflaque]]'' in Canada, ''[[Sprookjesboom]]'' and ''{{ill|Cafe de Wereld|nl|Cafe de Wereld|vertical-align=sup}}'' in the Netherlands, and ''[[Headcases]]'' in the UK.

=== Movement capture ===
[[Virtual reality]] and [[augmented reality]] providers, such as [[uSens]] and [[Gestigon]], allow users to interact with digital content in real time by capturing hand motions. This can be useful for training simulations, visual perception tests, or performing virtual walk-throughs in a 3D environment. Motion capture technology is frequently used in [[digital puppetry]] systems to drive computer-generated characters in real time.

[[Gait analysis]] is one application of motion capture in [[clinical medicine]]. Techniques allow clinicians to evaluate human motion across several biomechanical factors, often while streaming this information live into analytical software. One innovative use is pose detection, which can empower patients during post-surgical recovery or rehabilitation after injuries. This approach enables continuous monitoring, real-time guidance, and individually tailored programs to enhance patient outcomes.<ref>{{Cite web|url=https://www.abtosoftware.com/expertise/ai-based-pose-detection|title=AI based pose detection for physical rehabilitation software}}</ref> Some physical therapy clinics utilize motion capture as an objective way to quantify patient progress.<ref>{{Cite web|url=https://www.eumotus.com|title=Markerless Motion Capture {{!}} EuMotus|website=Markerless Motion Capture {{!}} EuMotus|language=en|access-date=2018-10-12}}</ref>
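As a minimal illustration of one such biomechanical factor, the sketch below computes a knee flexion angle from three tracked 3D marker positions. The marker placement, coordinates, and units are assumptions made for the example, not the output format of any particular capture system.

<syntaxhighlight lang="python">
# Sketch: compute the knee flexion angle, one biomechanical factor used
# in gait analysis, from three tracked 3D marker positions (hip, knee,
# ankle). All marker data below is hypothetical.
import numpy as np


def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` between the two limb segments."""
    u = np.asarray(proximal) - np.asarray(joint)   # thigh segment
    v = np.asarray(distal) - np.asarray(joint)     # shank segment
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))


# One frame of (hypothetical) marker positions, in meters.
hip, knee, ankle = (0.0, 0.9, 0.0), (0.05, 0.5, 0.0), (0.0, 0.1, 0.05)
angle = joint_angle(hip, knee, ankle)
flexion = 180.0 - angle   # 0 degrees corresponds to a fully extended leg
print(f"knee angle: {angle:.1f} deg, flexion: {flexion:.1f} deg")
</syntaxhighlight>

Streaming such angles frame by frame is what lets clinicians follow a patient's range of motion live during a session.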
During the filming of James Cameron's ''[[Avatar (2009 film)|Avatar]]'', all of the scenes involving motion capture were directed in real time using [[Autodesk MotionBuilder]] software to render a screen image that let the director and the actors see what they would look like in the movie, making it easier to direct the film as the viewer would see it. This method allowed views and angles not possible from a pre-rendered animation. Cameron was so proud of his results that he invited [[Steven Spielberg]] and [[George Lucas]] on set to view the system in action.

In Marvel's ''[[The Avengers (2012 film)|The Avengers]]'', Mark Ruffalo used motion capture so he could play his character [[Bruce Banner (Marvel Cinematic Universe)|the Hulk]] himself, rather than have the character be purely CGI as in previous films, making Ruffalo the first actor to play both the human and Hulk versions of Bruce Banner.

[[FaceRig]] software uses facial recognition technology from ULSee Inc. to map a player's facial expressions, and body tracking technology from Perception Neuron to map body movement, onto the motion of a 2D or 3D character on-screen.<ref>{{cite web|url=http://www.polygon.com/2014/6/30/5858610/this-facial-recognition-software-lets-you-be-octodad|title=This facial recognition software lets you be Octodad|first=Alexa Ray|last=Corriea|date=30 June 2014|access-date=4 January 2017|via=www.polygon.com}}</ref><ref>{{cite web|url=http://kotaku.com/turn-your-human-face-into-a-video-game-character-1490049650|title=Turn Your Human Face Into A Video Game Character|first=Luke|last=Plunkett|work=kotaku.com|date=27 December 2013 |access-date=4 January 2017}}</ref>

During the 2016 [[Game Developers Conference]] in San Francisco, [[Epic Games]] demonstrated full-body motion capture live in [[Unreal Engine]]. The whole scene, from the upcoming game ''[[Hellblade: Senua's Sacrifice|Hellblade]]'' about a woman warrior named Senua, was rendered in real time. The keynote<ref>{{cite web|url=https://www.fxguide.com/featured/put-your-digital-game-face-on/|title=Put your (digital) game face on|date=24 April 2016|work=fxguide.com|access-date=4 January 2017}}</ref> was a collaboration between [[Unreal Engine]], [[Ninja Theory]], [[3Lateral]], Cubic Motion, IKinema and [[Xsens]].

In 2020, the [[List of Olympic medalists in figure skating|two-time Olympic figure skating champion]] [[Yuzuru Hanyu]] graduated from [[Waseda University]]. In his thesis, he analysed his jumps using data provided by 31 sensors placed on his body.
He evaluated the use of the technology both to improve the scoring system and to help skaters improve their jumping technique.<ref name="SA 23082020">{{cite web|access-date=2 September 2023|language=ja|title=羽生結弦"動いたこと"は卒論完成 24時間テレビにリモート出演 (Yuzuru Hanyu completes graduation thesis on "moving things" and appears remotely on TV 24 hours a day)|url=https://www.sponichi.co.jp/sports/news/2020/08/23/kiji/20200823s00079000319000c.html}}</ref><ref name="NS23082020">{{cite web|access-date=2 September 2023|language=ja|title=羽生結弦が卒業論文を公開 24時間テレビに出演 (Yuzuru Hanyu publishes graduation thesis and appears on 24-hour TV)|url=https://www.nikkansports.com/entertainment/news/202008230000333.html}}</ref> In March 2021, a summary of the thesis was published in Waseda University's academic repository.<ref name="WUR2021">{{cite thesis|access-date=2 September 2023|language=ja|title=無線・慣性センサー式モーションキャプチャシステムのフィギュアスケートでの利活用に関するフィージビリティスタディ (A feasibility study on the use of wireless and inertial sensor motion capture systems in figure skating)|date=18 March 2021 |publisher=Waseda University |url=https://waseda.repo.nii.ac.jp/?action=pages_view_main&active_action=repository_view_main_item_detail&item_id=64787&item_no=1&page_id=13&block_id=21 |last1=Hanyu |first1=Yuzuru |last2=羽生 |first2=結弦 }}</ref>