=== Movement capture ===

[[Virtual reality]] and [[augmented reality]] providers, such as [[uSens]] and [[Gestigon]], allow users to interact with digital content in real time by capturing hand motions. This can be useful for training simulations, visual perception tests, and virtual walk-throughs of 3D environments. Motion capture technology is frequently used in [[digital puppetry]] systems to drive computer-generated characters in real time.

[[Gait analysis]] is one application of motion capture in [[clinical medicine]]. Techniques allow clinicians to evaluate human motion across several biomechanical factors, often while streaming this information live into analytical software. One use is pose detection, which can support patients during post-surgical recovery or rehabilitation after injuries by enabling continuous monitoring, real-time guidance, and individually tailored programs that improve patient outcomes.<ref>{{Cite web|url=https://www.abtosoftware.com/expertise/ai-based-pose-detection|title=AI based pose detection for physical rehabilitation software}}</ref> Some physical therapy clinics use motion capture as an objective way to quantify patient progress.<ref>{{Cite web|url=https://www.eumotus.com|title=Markerless Motion Capture {{!}} EuMotus|website=Markerless Motion Capture {{!}} EuMotus|language=en|access-date=2018-10-12}}</ref>

During the filming of James Cameron's [[Avatar (2009 film)|''Avatar'']], all of the scenes involving motion capture were directed in real time using [[Autodesk MotionBuilder]] software to render a screen image that let the director and the actors see what they would look like in the finished film, making it easier to direct the movie as it would be seen by the viewer. This method allowed views and angles not possible with pre-rendered animation. Cameron was so proud of the results that he invited [[Steven Spielberg]] and [[George Lucas]] on set to view the system in action.
In Marvel's ''[[The Avengers (2012 film)|The Avengers]]'', Mark Ruffalo used motion capture to play his character [[Bruce Banner (Marvel Cinematic Universe)|the Hulk]], rather than having the character rendered purely in CGI as in previous films, making Ruffalo the first actor to play both the human and Hulk versions of Bruce Banner.

[[FaceRig]] software uses facial recognition technology from ULSee Inc. to map a player's facial expressions, and body tracking technology from Perception Neuron to map body movement, onto the motion of a 2D or 3D character on screen.<ref>{{cite web|url=http://www.polygon.com/2014/6/30/5858610/this-facial-recognition-software-lets-you-be-octodad|title=This facial recognition software lets you be Octodad|first=Alexa Ray|last=Corriea|date=30 June 2014|access-date=4 January 2017|via=www.polygon.com}}</ref><ref>{{cite web|url=http://kotaku.com/turn-your-human-face-into-a-video-game-character-1490049650|title=Turn Your Human Face Into A Video Game Character|first=Luke|last=Plunkett|work=kotaku.com|date=27 December 2013 |access-date=4 January 2017}}</ref>

At the 2016 [[Game Developers Conference]] in San Francisco, [[Epic Games]] demonstrated full-body motion capture live in Unreal Engine. The whole scene, from the then-upcoming game ''[[Hellblade: Senua's Sacrifice|Hellblade]]'' about a woman warrior named Senua, was rendered in real time. The keynote<ref>{{cite web|url=https://www.fxguide.com/featured/put-your-digital-game-face-on/|title=Put your (digital) game face on|date=24 April 2016|work=fxguide.com|access-date=4 January 2017}}</ref> was a collaboration between [[Unreal Engine]], [[Ninja Theory]], [[3Lateral]], Cubic Motion, IKinema and [[Xsens]].

In 2020, the [[List of Olympic medalists in figure skating|two-time Olympic figure skating champion]] [[Yuzuru Hanyu]] graduated from [[Waseda University]]. In his thesis, he analysed his jumps using data from 31 sensors placed on his body, and evaluated how the technology could be used both to improve the scoring system and to help skaters improve their jumping technique.<ref name="SA 23082020">{{cite web|access-date=2 September 2023|language=ja|title=羽生結弦"動いたこと"は卒論完成 24時間テレビにリモート出演 (Yuzuru Hanyu completes graduation thesis on "moving things" and appears remotely on TV 24 hours a day)|url=https://www.sponichi.co.jp/sports/news/2020/08/23/kiji/20200823s00079000319000c.html}}<!-- auto-translated by Module:CS1 translator --></ref><ref name="NS23082020">{{cite web|access-date=2 September 2023|language=ja|title=羽生結弦が卒業論文を公開 24時間テレビに出演 (Yuzuru Hanyu publishes graduation thesis and appears on 24-hour TV)|url=https://www.nikkansports.com/entertainment/news/202008230000333.html}}<!-- auto-translated by Module:CS1 translator --></ref> In March 2021, a summary of the thesis was published in an academic journal.<ref name="WUR2021">{{cite thesis|access-date=2 September 2023|language=ja|title=無線・慣性センサー式モーションキャプチャシステムのフィギュアスケートでの利活用に関するフィージビリティスタディ (A feasibility study on the use of wireless and inertial sensor motion capture systems in figure skating)|date=18 March 2021 |publisher=Waseda University |url=https://waseda.repo.nii.ac.jp/?action=pages_view_main&active_action=repository_view_main_item_detail&item_id=64787&item_no=1&page_id=13&block_id=21 |last1=Hanyu |first1=Yuzuru |last2=羽生 |first2=結弦 }}<!-- auto-translated by Module:CS1 translator --></ref>