=={{anchor|Simulated world}}Virtual simulation==
A '''virtual simulation'''<!--boldface per [[WP:R#PLA]]--> is a category of simulation that uses simulation equipment to create a '''simulated world'''<!--boldface per [[WP:R#PLA]]--> for the user. Virtual simulations allow users to interact with a [[virtual world]]. Virtual worlds operate on platforms of integrated software and hardware components. In this manner, the system can accept input from the user (e.g., body tracking, voice/sound recognition, physical controllers) and produce output to the user (e.g., visual display, aural display, haptic display).<ref name="SW&CA">{{cite book |author=Sherman, W.R. |author2=Craig, A.B. |title=Understanding Virtual Reality |publisher=Morgan Kaufmann |location=San Francisco, CA |year=2003 |isbn=978-1-55860-353-0 }}</ref> Virtual simulations use these modes of interaction to produce a sense of [[immersion (virtual reality)|immersion]] for the user.

===Virtual simulation input hardware===
[[File:Simuladormotocicleta.jpg|right|thumb|Motorcycle simulator of ''Bienal do Automóvel'' exhibition, in [[Belo Horizonte]], Brazil]]
There is a wide variety of input hardware available to accept user input for virtual simulations. The following list briefly describes several of them:
* ''Body tracking'': The [[motion capture]] method is often used to record the user's movements and translate the captured data into inputs for the virtual simulation. For example, if a user physically turns their head, the motion is captured by the simulation hardware and translated into a corresponding shift in view within the simulation.
** [[Mo-cap suit|Capture suits]] and/or gloves may be used to capture movements of the user's body parts. The systems may have sensors incorporated inside them to sense movements of different body parts (e.g., fingers). Alternatively, these systems may have exterior tracking devices or markers that can be detected by external ultrasound, optical receivers or electromagnetic sensors. Internal inertial sensors are also available on some systems. The units may transmit data either wirelessly or through cables.
** [[Eye tracker]]s can also be used to detect eye movements so that the system can determine precisely where a user is looking at any given instant.
* ''Physical controllers'': Physical controllers provide input to the simulation only through direct manipulation by the user. In virtual simulations, tactile feedback from physical controllers is highly desirable in a number of simulation environments.
** [[Omnidirectional treadmill]]s can be used to capture the user's locomotion as they walk or run.
** High-fidelity instrumentation, such as instrument panels in virtual aircraft cockpits, provides users with actual controls to raise the level of immersion. For example, pilots can use the actual [[global positioning system]] controls from the real device in a simulated cockpit to help them practice procedures with the actual device in the context of the integrated cockpit system.
* ''Voice/sound recognition'': This form of interaction may be used either to interact with agents within the simulation (e.g., virtual people) or to manipulate objects in the simulation (e.g., information). Voice interaction is thought to increase the level of immersion for the user.
** Users may wear headsets with boom microphones or lapel microphones, or the room may be equipped with strategically located microphones.

====Current research into user input systems====
Research into future input systems holds a great deal of promise for virtual simulations. Systems such as [[brain–computer interface]]s (BCIs) offer the ability to further increase the level of immersion for virtual simulation users. Leeb, Lee, Keinrath, Scherer, Bischof, and Pfurtscheller<ref>{{cite journal |doi=10.1109/TNSRE.2007.906956 |pmid=18198704 |author=Leeb, R. |author2=Lee, F. |author3=Keinrath, C. |author4=Scherer, R. |author5=Bischof, H. |author6=Pfurtscheller, G. |title=Brain-Computer Communication: Motivation, Aim, and Impact of Exploring a Virtual Apartment |journal=IEEE Transactions on Neural Systems and Rehabilitation Engineering |volume=15 |issue=4 |pages=473–481 |year=2007 |s2cid=19998029 |url=http://www0.cs.ucl.ac.uk/research/vr/Projects/PRESENCCIA/Public/presenccia_pub/sharedDocuments/presenccia_publications/Publications/wp4/tug/0221.pdf |archive-url=https://web.archive.org/web/20200320130904/http://www0.cs.ucl.ac.uk/research/vr/Projects/PRESENCCIA/Public/presenccia_pub/sharedDocuments/presenccia_publications/Publications/wp4/tug/0221.pdf |archive-date=2020-03-20 |url-status=live}}</ref> demonstrated that naïve subjects could be trained to use a BCI to freely navigate a virtual apartment with relatively little effort. It is possible that these types of systems will become standard input modalities in future virtual simulation systems.

===Virtual simulation output hardware===
There is a wide variety of output hardware available to deliver a stimulus to users in virtual simulations. The following list briefly describes several of them:
* ''Visual display'': Visual displays provide the visual stimulus to the user.
** Stationary displays can vary from a conventional desktop display to 360-degree wrap-around screens to stereo three-dimensional screens. Conventional desktop displays can vary in size from {{Convert|15|to|60|in}}. Wrap-around screens are typically used in what is known as a [[cave automatic virtual environment]] (CAVE). Stereo three-dimensional screens produce three-dimensional images either with or without special glasses, depending on the design.
** [[Head-mounted display]]s (HMDs) have small displays that are mounted on headgear worn by the user. These systems are connected directly into the virtual simulation to provide the user with a more immersive experience. Weight, update rate and field of view are some of the key variables that differentiate HMDs. Heavier HMDs are undesirable because they cause fatigue over time. If the update rate is too slow, the system is unable to refresh the displays fast enough to correspond with a quick head turn by the user; slower update rates tend to cause simulation sickness and disrupt the sense of immersion. The [[field of view]], or the angular extent of the world that is seen at a given moment, can vary from system to system and has been found to affect the user's sense of immersion.
* ''Aural display'': Several different types of audio systems exist to help the user hear and localize sounds spatially. Special software can be used to produce [[3D audio]] effects to create the illusion that sound sources are placed within a defined three-dimensional space around the user.
** Stationary conventional speaker systems may be used to provide dual or multi-channel surround sound. However, external speakers are not as effective as headphones in producing 3D audio effects.<ref name="SW&CA"/>
** Conventional headphones offer a portable alternative to stationary speakers. They also have the added advantages of masking real-world noise and facilitating more effective 3D audio effects.<ref name="SW&CA"/> {{Dubious|Aural Display|date=September 2018|Conventional headphones bullet=This claim is dubious. Most binaural 3D audio is not a simulation of aural reality. This is especially true when the audio is being created by the HRTF process. In this instance, it is an emulation of the amplification system that the head-related impulse response was taken from. A new sound signal is convolved with the HRTF to make it sound as though it were being heard by the mannequin that was used in the recording. It is an emulation of that system.}}
* ''Haptic display'': These displays provide a sense of touch to the user ([[haptic technology]]). This type of output is sometimes referred to as force feedback.
** Tactile tile displays use different types of actuators, such as inflatable bladders, vibrators, low-frequency sub-woofers, pin actuators and/or thermo-actuators, to produce sensations for the user.
** End-effector displays can respond to users' inputs with resistance and force.<ref name="SW&CA"/> These systems are often used in medical applications for remote surgeries that employ robotic instruments.<ref>Zahraee, A.H., Szewczyk, J., [[Jamie Paik|Paik, J.K.]], Guillaume, M. (2010). Robotic hand-held surgical device: evaluation of end-effector's kinematics and development of proof-of-concept prototypes. Proceedings of the 13th International Conference on Medical Image Computing and Computer-Assisted Intervention, Beijing, China.</ref>
* ''Vestibular display'': These displays provide a sense of motion to the user ([[motion simulator]]). They often manifest as motion bases for virtual vehicle simulations such as driving simulators or flight simulators. Motion bases are fixed in place but use actuators to move the simulator in ways that can produce the sensation of pitching, yawing or rolling. The simulators can also move in such a way as to produce a sense of acceleration on all axes (e.g., the motion base can produce the sensation of falling).
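
The binaural rendering described under aural displays — convolving a sound signal with a head-related impulse response (HRIR) per ear — can be sketched in a few lines. This is a toy illustration, not production audio code: the HRIR values below are made-up numbers chosen only to show the delay-and-attenuation idea, not measured responses.

```python
def convolve(signal, impulse_response):
    """Discrete linear convolution; output has len(signal) + len(ir) - 1 samples."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

def render_binaural(mono, hrir_left, hrir_right):
    """Return (left, right) channels for one static source position."""
    return convolve(mono, hrir_left), convolve(mono, hrir_right)

# Toy example: a source on the listener's left reaches the left ear
# earlier and louder than the right ear (hypothetical coefficients).
mono = [1.0, 0.5, 0.25]
hrir_left = [0.9, 0.1]           # strong, early arrival
hrir_right = [0.0, 0.4, 0.1]     # delayed, attenuated arrival
left, right = render_binaural(mono, hrir_left, hrir_right)
```

A real system would use measured HRIRs for many directions, interpolate between them as the source or head moves, and perform the convolution in the frequency domain for speed; the structure of the computation is the same.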
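
One common trick behind the motion-base behavior just described is "tilt coordination": because the base cannot accelerate for long in its fixed footprint, it pitches the cab so that a component of gravity acts along the occupant's body axis, which the vestibular system reads as sustained forward acceleration. A minimal sketch of the geometry, with a made-up actuator travel limit:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tilt_for_acceleration(accel_mps2, max_tilt_rad=math.radians(20)):
    """Pitch angle (radians) whose gravity component mimics the given
    sustained forward acceleration, clamped to the actuator's travel.
    The 20-degree limit is a hypothetical figure for illustration."""
    theta = math.atan2(accel_mps2, G)
    return max(-max_tilt_rad, min(max_tilt_rad, theta))

# Suggesting a gentle 2 m/s^2 acceleration takes roughly 11.5 degrees of pitch.
theta = tilt_for_acceleration(2.0)
```

In practice the tilt is applied slowly enough to stay below the occupant's rotation-detection threshold, while brief onset accelerations are produced by actually translating the base.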