== Teleoperation ==

Teleoperation indicates operation of a machine at a distance. It is similar in meaning to the phrase "remote control" but is usually encountered in research, academic and technical environments. It is most commonly associated with robotics and mobile robots but can be applied to a whole range of circumstances in which a device or machine is operated by a person from a distance.<ref>{{cite web |last=Corley |first=Anne-Marie |title=The Reality of Robot Surrogates |url=https://spectrum.ieee.org/robotics/humanoids/the-reality-of-robot-surrogates/0 |archive-url=https://archive.today/20130415003449/http://spectrum.ieee.org/robotics/humanoids/the-reality-of-robot-surrogates/0 |url-status=dead |archive-date=15 April 2013 |access-date=19 March 2013 |publisher=spectrum.ieee.com |date=September 2009}}</ref>

[[File:Virtual-Fixtures-USAF-AR.jpg|thumb|Early telerobotics (Rosenberg, 1992): the US Air Force Virtual Fixtures system]]

Teleoperation is the most standard term, used both in research and technical communities, for referring to operation at a distance. This is opposed to "[[telepresence]]", which refers to the subset of telerobotic systems configured with an immersive interface such that the operator feels present in the remote environment, projecting their presence through the remote robot. One of the first telepresence systems that enabled operators to feel present in a remote environment through all of the primary senses (sight, sound, and touch) was the [[Virtual Fixtures]] system developed at US [[Air Force Research Laboratories Space Vehicles Directorate|Air Force Research Laboratories]] in the early 1990s. The system enabled operators to perform dexterous tasks (inserting pegs into holes) remotely such that the operator would feel as if he or she were inserting the pegs, when in fact a robot was performing the task remotely.<ref>Rosenberg, L.B. (1992). "The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments". ''Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB OH, 1992''.</ref><ref>Rosenberg, L.B. (1993). "Virtual Fixtures: Perceptual Overlays for Telerobotic Manipulation". ''In Proc. of the IEEE Annual Int. Symposium on Virtual Reality (1993)'': pp. 76–82.</ref><ref>Rosenberg, Louis B. [http://spie.org/Publications/Proceedings/Paper/10.1117/12.164901 "Virtual Fixtures as tools to enhance operator performance in Telepresence Environments"]. ''Telemanipulator Technology and Space Telerobotics''. (1993). [[Digital object identifier|doi]]:[https://doi.org/10.1117%2F12.164901 10.1117/12.164901].</ref>

A '''telemanipulator''' (or '''teleoperator''') is a device that is controlled remotely by a human operator. In simple cases the controlling operator's command actions correspond directly to actions in the device controlled, as for example in a radio-controlled model aircraft or a tethered deep submergence vehicle. Where communication delays make direct control impractical (such as with a remote planetary rover), or where it is desired to reduce operator workload (as in a remotely controlled spy or attack aircraft), the device is not controlled directly; instead it is commanded to follow a specified path. At increasing levels of sophistication the device may operate somewhat independently in matters such as obstacle avoidance, a capability also commonly employed in planetary rovers. Devices designed to allow the operator to control a robot at a distance are sometimes called telecheric robotics.
Two major components of telerobotics and telepresence are the visual and control applications. A remote camera provides a visual representation of the view from the robot. Placing the robotic camera in a perspective that allows intuitive control is a recent technique that, although anticipated in science fiction ([[Robert A. Heinlein]]'s 1942 short story "[[Waldo (short story)|Waldo]]"), has only lately become practical, as the speed, resolution and bandwidth needed to control the robot camera in a meaningful way have only recently become adequate. Using a [[head mounted display]], control of the camera can be facilitated by tracking the head, as shown in the figure. This only works if the user feels comfortable with the latency of the system, the lag in the response to movements, and the visual representation. Issues such as inadequate resolution, latency of the video image, lag in the mechanical and computer processing of the movement and response, and optical distortion due to the camera lens and head-mounted display lenses can cause the user to experience '[[simulation sickness|simulator sickness]]', which is exacerbated by the lack of vestibular stimulation accompanying the visual representation of motion. Mismatches with the user's motions, such as registration errors, lag in movement response due to overfiltering, inadequate resolution for small movements, and slow speed, can contribute to these problems. The same technology can control the robot, but then the [[eye–hand coordination]] issues become even more pervasive through the system, and user tension or frustration can make the system difficult to use.{{citation needed|date=November 2018}}

The tendency in building robots has been to minimize the [[degrees of freedom (mechanics)|degrees of freedom]], because that reduces the control problems. Recent improvements in computers have shifted the emphasis to more degrees of freedom, allowing robotic devices that seem more intelligent and more human in their motions. This also allows more direct teleoperation, as the user can [[motion capture|control the robot with their own motions]].<ref>Miller, Nathan, et al. "[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.85.2155&rep=rep1&type=pdf Motion capture from inertial sensing for untethered humanoid teleoperation]". ''Humanoid Robots, 2004: 4th IEEE/RAS International Conference''. Vol. 2. IEEE, 2004.</ref>
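The head-tracked camera control described above amounts to a simple command loop: read the head orientation, limit how fast the commanded angle may change, and send the result to the remote pan/tilt unit. The sketch below (Python) illustrates one possible form of such a loop; it is not taken from any particular system, and the interfaces <code>read_head_orientation()</code> and <code>send_camera_command()</code> are hypothetical placeholders for whatever the HMD and robot SDKs actually provide.

<syntaxhighlight lang="python">
# Illustrative sketch only: maps tracked head yaw/pitch (degrees) to remote
# camera pan/tilt commands. The two callables are hypothetical placeholders.
import time

MAX_RATE_DEG_S = 90.0   # limit commanded angular rate so the camera can keep up
LOOP_PERIOD_S = 0.02    # 50 Hz command loop


def clamp(value, lo, hi):
    return max(lo, min(hi, value))


def teleop_camera_loop(read_head_orientation, send_camera_command):
    """Continuously convert head orientation into rate-limited camera commands."""
    prev_yaw, prev_pitch = read_head_orientation()
    while True:
        yaw, pitch = read_head_orientation()
        # Rate-limit the change so a sudden head turn does not command a
        # motion faster than the remote mechanism (or the link) can follow.
        max_step = MAX_RATE_DEG_S * LOOP_PERIOD_S
        yaw = prev_yaw + clamp(yaw - prev_yaw, -max_step, max_step)
        pitch = prev_pitch + clamp(pitch - prev_pitch, -max_step, max_step)
        send_camera_command(pan=yaw, tilt=clamp(pitch, -80.0, 80.0))
        prev_yaw, prev_pitch = yaw, pitch
        time.sleep(LOOP_PERIOD_S)
</syntaxhighlight>

The rate limit and tilt clamp in this sketch stand in for the kinds of filtering real systems apply; as the text notes, too much filtering adds lag of its own and can worsen simulator sickness rather than relieve it.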