===Virtual simulation input hardware===
[[File:Simuladormotocicleta.jpg|right|thumb|Motorcycle simulator at the ''Bienal do Automóvel'' exhibition in [[Belo Horizonte]], Brazil]]
A wide variety of input hardware is available to accept user input for virtual simulations. The following list briefly describes several of them:
* ''Body tracking'': The [[motion capture]] method is often used to record the user's movements and translate the captured data into inputs for the virtual simulation. For example, if a user physically turns their head, the motion is captured by the simulation hardware and translated into a corresponding shift in view within the simulation (a minimal sketch of such a mapping follows this list).
** [[Mo-cap suit|Capture suits]] and/or gloves may be used to capture movements of the user's body parts. These systems may have sensors incorporated inside them to sense movements of different body parts (e.g., fingers). Alternatively, they may carry exterior tracking devices or markers that can be detected by external ultrasound, optical receivers or electromagnetic sensors. Internal inertial sensors are also available on some systems. The units may transmit data either wirelessly or through cables.
** [[Eye tracker]]s can also be used to detect eye movements so that the system can determine precisely where a user is looking at any given instant.
* ''Physical controllers'': Physical controllers provide input to the simulation only through direct manipulation by the user. In virtual simulations, tactile feedback from physical controllers is highly desirable in a number of simulation environments.
** [[Omnidirectional treadmill]]s can be used to capture the user's locomotion as they walk or run.
** High-fidelity instrumentation, such as instrument panels in virtual aircraft cockpits, provides users with actual controls to raise the level of immersion. For example, pilots can use the actual [[global positioning system]] controls from the real device in a simulated cockpit to help them practice procedures with the actual device in the context of the integrated cockpit system.
* ''Voice/sound recognition'': This form of interaction may be used either to interact with agents within the simulation (e.g., virtual people) or to manipulate objects in the simulation (e.g., information). Voice interaction presumably increases the level of immersion for the user.
** Users may use headsets with boom microphones or lapel microphones, or the room may be equipped with strategically located microphones.
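The translation from tracked body motion to the virtual view can be summarized in a short sketch. The following minimal example illustrates how a captured head orientation might be mapped to a camera view shift; all names (such as <code>HeadPose</code> and <code>apply_head_pose</code>) are hypothetical and are not drawn from any particular tracking system or simulation engine.

<syntaxhighlight lang="python">
# Illustrative sketch only: maps a tracked head orientation to a camera view.
# Names (HeadPose, Camera, apply_head_pose) are hypothetical, not from any
# specific motion-capture or simulation framework.
from dataclasses import dataclass

@dataclass
class HeadPose:
    yaw_deg: float    # left/right head rotation reported by the tracker
    pitch_deg: float  # up/down head rotation reported by the tracker

@dataclass
class Camera:
    yaw_deg: float = 0.0
    pitch_deg: float = 0.0

def apply_head_pose(camera: Camera, pose: HeadPose, sensitivity: float = 1.0) -> None:
    """Translate captured head motion into a corresponding shift in view."""
    camera.yaw_deg = pose.yaw_deg * sensitivity
    # Clamp pitch so the virtual view cannot rotate past straight up or down.
    camera.pitch_deg = max(-90.0, min(90.0, pose.pitch_deg * sensitivity))

# Example: the user turns their head 30 degrees right and looks up 10 degrees.
cam = Camera()
apply_head_pose(cam, HeadPose(yaw_deg=30.0, pitch_deg=10.0))
print(cam)  # Camera(yaw_deg=30.0, pitch_deg=10.0)
</syntaxhighlight>

In practice the sensitivity factor and clamping limits depend on the tracking hardware and the simulation engine in use.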
====Current research into user input systems====
Research into future input systems holds a great deal of promise for virtual simulations. Systems such as [[brain–computer interface]]s (BCIs) offer the ability to further increase the level of immersion for virtual simulation users. Leeb, Lee, Keinrath, Scherer, Bischof and Pfurtscheller<ref>{{cite journal |doi=10.1109/TNSRE.2007.906956 |pmid=18198704 |author=Leeb, R. |author2=Lee, F. |author3=Keinrath, C. |author4=Scherer, R. |author5=Bischof, H. |author6=Pfurtscheller, G. |title=Brain-Computer Communication: Motivation, Aim, and Impact of Exploring a Virtual Apartment |journal=IEEE Transactions on Neural Systems and Rehabilitation Engineering |volume=15 |issue=4 |pages=473–481 |year=2007 |s2cid=19998029 |url=http://www0.cs.ucl.ac.uk/research/vr/Projects/PRESENCCIA/Public/presenccia_pub/sharedDocuments/presenccia_publications/Publications/wp4/tug/0221.pdf |archive-url=https://web.archive.org/web/20200320130904/http://www0.cs.ucl.ac.uk/research/vr/Projects/PRESENCCIA/Public/presenccia_pub/sharedDocuments/presenccia_publications/Publications/wp4/tug/0221.pdf |archive-date=2020-03-20 |url-status=live}}</ref> demonstrated that naïve subjects could be trained to use a BCI to navigate a virtual apartment with relative ease, freely exploring the virtual environment with relatively minimal effort. It is possible that these types of systems will become standard input modalities in future virtual simulation systems.
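As an illustration of how a BCI might drive navigation in a virtual environment, the sketch below maps the discrete output of a hypothetical BCI classifier (for example, imagined left-hand, right-hand, or foot movement) to turning and walking commands. The labels, turn angles and step sizes are assumptions for illustration only and are not taken from the cited study.

<syntaxhighlight lang="python">
# Illustrative sketch only: maps hypothetical BCI classifier labels to
# navigation commands in a virtual environment. The labels and amounts are
# assumptions, not drawn from the cited study.
import math

# Hypothetical mapping from classifier labels to navigation actions.
BCI_COMMANDS = {
    "imagined_left_hand": ("turn", -15.0),   # turn left 15 degrees
    "imagined_right_hand": ("turn", 15.0),   # turn right 15 degrees
    "imagined_feet": ("walk", 0.5),          # step forward 0.5 m
}

def navigate(position, heading_deg, label):
    """Apply one classified BCI command to the user's position and heading."""
    action, amount = BCI_COMMANDS[label]
    if action == "turn":
        heading_deg = (heading_deg + amount) % 360.0
    else:  # walk forward along the current heading
        x, y = position
        position = (x + amount * math.cos(math.radians(heading_deg)),
                    y + amount * math.sin(math.radians(heading_deg)))
    return position, heading_deg

pos, heading = (0.0, 0.0), 0.0
for label in ["imagined_right_hand", "imagined_feet", "imagined_feet"]:
    pos, heading = navigate(pos, heading, label)
print(pos, heading)  # roughly (0.966, 0.259) at heading 15.0
</syntaxhighlight>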