== Components and criteria of robotic autonomy ==
{{More citations needed section|date=December 2020}}

=== Self-maintenance ===
<!-- [[Image:PeopleGuide.jpg|left|thumb|Exteroceptive sensors: 1. blue laser rangefinder senses up to 360 distance readings in a 180-degree slice; 2. 24 round golden ultrasonic sensors sample range readings in a 15-degree cone; 3. ten touch panels along the bottom detect shoes and other low-lying objects. 4. break beams between the lower and upper segments sense tables and other mid-level obstacles.]] -->
The first requirement for complete physical autonomy is the ability of a robot to take care of itself. Many battery-powered robots on the market today can find and connect to a charging station, and some toys, such as Sony's ''[[Aibo]]'', are capable of self-docking to charge their batteries.

Self-maintenance is based on "[[proprioception]]", or sensing one's own internal status. In the battery-charging example, the robot can tell proprioceptively that its batteries are low, and it then seeks the charger. Another common proprioceptive sensor is for heat monitoring. Increased proprioception will be required for robots to work autonomously near people and in harsh environments. Common proprioceptive sensors include thermal, optical, and haptic sensing, as well as the [[Hall effect]] (electric).

[[Image:MobileEyesIntel.jpg|right|thumb|Robot GUI display showing battery voltage and other proprioceptive data in the lower right-hand corner. The display is for user information only; autonomous robots monitor and respond to [[Proprioception|proprioceptive]] sensors without human intervention to keep themselves safe and operating properly.]]

=== Sensing the environment ===
Exteroception is [[robotic sensing|sensing]] things about the environment. Autonomous robots must have a range of environmental sensors to perform their task and stay out of trouble.
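The self-maintenance loop described above (battery runs low, the robot seeks its charger, recharges, then resumes work) can be sketched as a small state machine. This is an illustrative toy model, not any vendor's actual controller; the thresholds and drain rates are made up.

```python
from enum import Enum, auto

class Mode(Enum):
    WORK = auto()
    SEEK_CHARGER = auto()
    CHARGING = auto()

class SelfMaintainingRobot:
    """Toy self-maintenance model: the robot 'proprioceptively'
    reads its own battery level and switches behavior when low."""
    LOW = 0.2    # assumed low-battery threshold (illustrative)
    FULL = 0.95  # assumed recharge target (illustrative)

    def __init__(self, battery=1.0):
        self.battery = battery
        self.mode = Mode.WORK

    def step(self):
        if self.mode is Mode.WORK:
            self.battery -= 0.1            # working drains the battery
            if self.battery <= self.LOW:
                self.mode = Mode.SEEK_CHARGER
        elif self.mode is Mode.SEEK_CHARGER:
            self.battery -= 0.01           # navigating to the dock
            self.mode = Mode.CHARGING      # assume the dock is found
        elif self.mode is Mode.CHARGING:
            self.battery = min(1.0, self.battery + 0.3)
            if self.battery >= self.FULL:
                self.mode = Mode.WORK
        return self.mode
```

The key point is that the mode transition is driven entirely by an internal (proprioceptive) reading, with no human in the loop.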
Autonomous robots can also recognize sensor failures and minimize the impact of those failures on performance.<ref>{{Cite journal |last=Ferrell |first=Cynthia |date=March 1994 |title=Failure Recognition and Fault Tolerance of an Autonomous Robot |url=http://dx.doi.org/10.1177/105971239400200403 |journal=Adaptive Behavior |volume=2 |issue=4 |pages=375–398 |doi=10.1177/105971239400200403 |s2cid=17611578 |issn=1059-7123}}</ref> Common exteroceptive sensors cover the [[electromagnetic spectrum]], sound, touch, chemical (smell, odor), temperature, range to various objects, and altitude.

Some robotic lawn mowers adapt their programming by detecting the speed at which grass grows, as needed to maintain a perfectly cut lawn, and some vacuum-cleaning robots have dirt detectors that sense how much dirt is being picked up and use this information to stay in one area longer.

=== Task performance ===
The next step in autonomous behavior is to actually perform a physical task. A new area showing commercial promise is domestic robots, with a flood of small vacuuming robots beginning with [[iRobot]] and [[Electrolux]] in 2002. While the level of intelligence in these systems is not high, they navigate over wide areas and pilot in tight situations around homes using contact and non-contact sensors. Both of these robots use proprietary algorithms to increase coverage over a simple random bounce.

The next level of autonomous task performance requires a robot to perform conditional tasks. For instance, security robots can be programmed to detect intruders and respond in a particular way depending upon where the intruder is.
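The failure-recognition ability mentioned at the start of this subsection can be illustrated with a very simple heuristic: a sensor whose recent readings stop varying at all is likely stuck or dead and should be excluded from decision-making. This is a minimal sketch, not the fault-tolerance method of the cited paper; the sensor names are made up.

```python
from statistics import pvariance

def detect_stuck_sensors(history, window=5, eps=1e-9):
    """Flag sensors whose last `window` readings barely vary --
    a crude proxy for a stuck or failed sensor.

    `history` maps a sensor name to a list of its readings
    (names here are purely illustrative)."""
    failed = set()
    for name, readings in history.items():
        recent = readings[-window:]
        # a healthy range sensor jitters; a frozen one does not
        if len(recent) == window and pvariance(recent) < eps:
            failed.add(name)
    return failed
```

Real systems use far richer models (cross-checking redundant sensors, plausibility limits, timeouts), but the principle is the same: monitor the sensors themselves, not just the world they report on.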
For example, [[Amazon (company)|Amazon]] launched its Astro robot for home monitoring, security, and eldercare in September 2021.<ref>{{cite web |last1=Heater |first1=Brian |title=Why Amazon built a home robot |url=https://techcrunch.com/2021/09/28/why-amazon-built-a-home-robot/ |website=TechCrunch |date=28 September 2021 |access-date=29 September 2021}}</ref>

=== Autonomous navigation ===

==== Indoor navigation ====
For a robot to associate behaviors with a place ([[robot localization|localization]]), it must know where it is and be able to navigate point-to-point. Such navigation began with wire guidance in the 1970s and progressed in the early 2000s to beacon-based [[triangulation]]. Current commercial robots autonomously navigate by sensing natural features. The first commercial robots to achieve this were Pyxus' HelpMate hospital robot and the CyberMotion guard robot, both designed by robotics pioneers in the 1980s. These robots originally used manually created [[Computer-aided design|CAD]] floor plans, sonar sensing, and wall-following variations to navigate buildings.

The next generation, such as MobileRobots' [[PatrolBot]] and autonomous wheelchair,<ref>{{cite web |url=https://www.researchgate.net/publication/236882346 |title=Autonomous Wheelchair: Concept and Exploration |first1=Rafael |last1=Berkvens |first2=Wouter |last2=Rymenants |first3=Maarten |last3=Weyn |first4=Simon |last4=Sleutel |first5=Willy |last5=Loockx |work=AMBIENT 2012 : The Second International Conference on Ambient Computing, Applications, Services and Technologies |via=[[ResearchGate]]}}</ref> both introduced in 2004, can create their own laser-based [[robotic mapping|maps of a building]] and navigate open areas as well as corridors. Their control system changes its path on the fly if something blocks the way. At first, autonomous navigation was based on planar sensors, such as laser range-finders, that can only sense at one level.
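The beacon-based localization mentioned above can be sketched as trilateration: given range measurements to beacons at known positions, subtracting one range equation from the others makes the problem linear, and least squares recovers the robot's position. This is a minimal 2-D sketch with illustrative beacon coordinates, not any commercial system's algorithm.

```python
import numpy as np

def trilaterate(beacons, dists):
    """Estimate a 2-D position from range measurements to >= 3
    beacons at known positions.

    Subtracting the first range equation |p - b_i|^2 = r_i^2 from
    the others cancels the quadratic term |p|^2, leaving a linear
    system solved here by least squares."""
    (x0, y0), r0 = beacons[0], dists[0]
    A, b = [], []
    for (xi, yi), ri in zip(beacons[1:], dists[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol  # estimated (x, y)
```

With more than three beacons the least-squares formulation also averages out measurement noise, which is why redundant beacons were common in practice.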
The most advanced systems now fuse information from various sensors for both localization (position) and navigation. Systems such as Motivity can rely on different sensors in different areas, depending upon which provides the most reliable data at the time, and can re-map a building autonomously.

Rather than climb stairs, which requires highly specialized hardware, most indoor robots navigate handicapped-accessible areas, controlling elevators and electronic doors.<ref>[http://www.ccsrobotics.com/speciminder.htm "Speci-Minder; see elevator and door access"] {{webarchive |url=https://web.archive.org/web/20080102053209/http://www.ccsrobotics.com/speciminder.htm |date=January 2, 2008 }}</ref> With such electronic access-control interfaces, robots can now freely navigate indoors. Autonomously climbing stairs and opening doors manually remain topics of current research.

As these indoor techniques continue to develop, vacuuming robots will gain the ability to clean a specific user-specified room or a whole floor, and security robots will be able to cooperatively surround intruders and cut off exits. These advances also bring concomitant protections: robots' internal maps typically permit "forbidden areas" to be defined, preventing robots from autonomously entering certain regions.

==== Outdoor navigation ====
Outdoor autonomy is most easily achieved in the air, since obstacles are rare. [[Cruise missile]]s are rather dangerous, highly autonomous robots. Pilotless drone aircraft are increasingly used for reconnaissance. Some of these [[unmanned aerial vehicle]]s (UAVs) can fly their entire mission without any human interaction, except possibly for the landing, where a person intervenes by radio remote control; other drones, however, are capable of safe, automatic landings.
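The multi-sensor fusion described for indoor systems above, relying most on whichever sensor is most reliable at the moment, is often formalized by weighting each estimate by the inverse of its variance. The following sketch fuses scalar position estimates under that assumption; the sensors and variances are invented for illustration.

```python
def fuse_estimates(estimates):
    """Inverse-variance fusion of scalar estimates.

    `estimates` is a list of (value, variance) pairs, one per
    sensor. Low-variance (more reliable) sensors get proportionally
    more weight; returns the fused value and its fused variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total
```

This is the core idea behind Kalman-style fusion: as a sensor's readings degrade (variance grows), its influence on the fused estimate shrinks automatically, which is how a system can "rely on different sensors in different areas" without an explicit switch.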
[[SpaceX]] operates a number of [[autonomous spaceport drone ship]]s, used to safely land and recover [[Falcon 9]] rockets at sea.<ref name=nsf20141117>{{cite news |last1=Bergin |first1=Chris |title=Pad 39A – SpaceX laying the groundwork for Falcon Heavy debut |url=http://www.nasaspaceflight.com/2014/11/pad-39a-spacex-groundwork-falcon-heavy-debut/ |access-date=2014-11-17 |work=NASA Spaceflight |date=2014-11-18}}</ref> A few countries, such as India, have started working on [https://www.skyeair.tech/ robotic deliveries] of food and other articles by [[Unmanned aerial vehicle|drone]].

Outdoor autonomy is most difficult for ground vehicles, due to:
* Three-dimensional terrain
* Great disparities in surface density
* Weather exigencies
* Instability of the sensed environment

=== Open problems in autonomous robotics ===
{{Expand section|date=July 2008}}
Several open problems in autonomous robotics are specific to the field rather than part of the general pursuit of AI. According to George A. Bekey's ''Autonomous Robots: From Biological Inspiration to Implementation and Control'', these include ensuring that the robot is able to function correctly and avoid obstacles autonomously.
Reinforcement learning has been used to control and plan the navigation of autonomous robots, specifically when a group of them operates in collaboration.<ref name="MBK">{{Cite journal | title=Detection of Static and Mobile Targets by an Autonomous Agent with Deep Q-Learning Abilities | journal=Entropy | year=2022 | volume=24 | issue=8 | page=1168 | doi=10.3390/e24081168 | pmid=36010832 | pmc=9407070 | doi-access=free | last1=Matzliach | first1=Barouch | last2=Ben-Gal | first2=Irad | last3=Kagan | first3=Evgeny | bibcode=2022Entrp..24.1168M }}</ref>

;Energy autonomy and foraging
Researchers concerned with creating true [[artificial life]] are interested not only in intelligent control, but also in the robot's capacity to find its own resources through [[foraging]] (looking for food, which here includes both energy and spare parts). This is related to '''autonomous foraging''', a concern within the sciences of [[behavioral ecology]], [[social anthropology]], and [[human behavioral ecology]], as well as [[robot]]ics, [[artificial intelligence]], and [[artificial life]].<ref>{{cite book |last1=Kagan |first1=Evgeny |last2=Ben-Gal |first2=Irad |date=23 June 2015 |title=Search and Foraging: Individual Motion and Swarm Dynamics |publisher=CRC Press, Taylor and Francis |url=https://www.amazon.com/Search-Foraging-Individual-Motion-Dynamics-ebook/dp/B010ACWAXC?ie=UTF8&*Version*=1&*entries*=0}}</ref>
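The reinforcement-learning approach to navigation mentioned above can be illustrated with tabular Q-learning on a toy problem: a single robot in a short corridor learns, from reward alone, that moving toward the goal cell is the best policy. This is a minimal single-agent sketch, not the deep multi-agent method of the cited paper; all parameters are illustrative.

```python
import random

def q_learn_corridor(length=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning on a 1-D corridor.

    States are cells 0..length-1; actions are 0 (left) and 1 (right);
    reward is 1 for reaching the last cell, 0 otherwise. Returns the
    learned Q-table."""
    random.seed(0)                       # deterministic for the demo
    Q = [[0.0, 0.0] for _ in range(length)]
    goal = length - 1
    for _ in range(episodes):
        s = 0
        while s != goal:
            # epsilon-greedy action selection
            if random.random() < eps:
                a = random.randrange(2)
            else:
                a = max((0, 1), key=lambda act: Q[s][act])
            s2 = max(0, min(goal, s + (1 if a == 1 else -1)))
            r = 1.0 if s2 == goal else 0.0
            # standard Q-learning update
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q
```

After training, the greedy policy in every non-goal cell is "move right", and the Q-values decay geometrically (by the discount factor) with distance from the goal, which is how the learned value function encodes a navigation plan.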