{{short description|Robot that performs behaviors or tasks with a high degree of autonomy}}
An '''autonomous robot''' is a [[robot]] that acts without recourse to human control. Historic examples include [[space probes]]. Modern examples include self-driving [[Robotic vacuum cleaner|vacuums]] and [[Self-driving car|cars]]. [[Industrial robot|Industrial robot arms]] that work on assembly lines inside factories may also be considered autonomous robots, though their [[Agency (psychology)|autonomy]] is restricted by a highly structured environment and their inability to [[Robot locomotion|locomote]].

== Components and criteria of robotic autonomy ==
{{More citations needed section|date=December 2020}}

=== Self-maintenance ===
<!-- [[Image:PeopleGuide.jpg|left|thumb|Exteroceptive sensors: 1. blue laser rangefinder senses up to 360 distance readings in a 180-degree slice; 2. 24 round golden ultrasonic sensors sample range readings in a 15-degree cone; 3. ten touch panels along the bottom detect shoes and other low-lying objects. 4. break beams between the lower and upper segments sense tables and other mid-level obstacles.]] -->
The first requirement for complete physical autonomy is the ability of a robot to take care of itself. Many of the battery-powered robots on the market today can find and connect to a charging station, and some toys like Sony's ''[[Aibo]]'' are capable of self-docking to charge their batteries.

Self-maintenance is based on "[[proprioception]]", or sensing one's own internal status. In the battery-charging example, the robot can tell proprioceptively that its batteries are low, and it then seeks the charger. Another common proprioceptive sensor is for heat monitoring. Increased proprioception will be required for robots to work autonomously near people and in harsh environments. Common proprioceptive sensors include thermal, optical, and haptic sensing, as well as the [[Hall effect]] (electric).

[[Image:MobileEyesIntel.jpg|right|thumb|Robot GUI display showing battery voltage and other proprioceptive data in the lower right-hand corner. The display is for user information only. Autonomous robots monitor and respond to [[Proprioception|proprioceptive]] sensors without human intervention to keep themselves safe and operating properly.]]

=== Sensing the environment ===
Exteroception is [[robotic sensing|sensing]] things about the environment. Autonomous robots must have a range of environmental sensors to perform their task and stay out of trouble. An autonomous robot can recognize sensor failures and minimize the resulting impact on its performance.<ref>{{Cite journal |last=Ferrell |first=Cynthia |date=March 1994 |title=Failure Recognition and Fault Tolerance of an Autonomous Robot |url=http://dx.doi.org/10.1177/105971239400200403 |journal=Adaptive Behavior |volume=2 |issue=4 |pages=375–398 |doi=10.1177/105971239400200403 |s2cid=17611578 |issn=1059-7123}}</ref> Common exteroceptive sensors detect the [[electromagnetic spectrum]], sound, touch, chemicals (smell, odor), temperature, range to various objects, and altitude.

Some robotic lawn mowers adapt their programming by detecting the speed at which grass grows so as to maintain a perfectly cut lawn, and some vacuum-cleaning robots have dirt detectors that sense how much dirt is being picked up and use this information to stay in one area longer.
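How proprioceptive and exteroceptive readings drive such behavior can be illustrated with a short control-loop sketch. It is not the software of any commercial product; the robot interface used here (<code>battery_level</code>, <code>dirt_level</code>, and the docking and cleaning actions) is a hypothetical placeholder.

<syntaxhighlight lang="python">
# Sketch of a sense-decide-act loop for a hypothetical cleaning robot.
# battery_level() is proprioceptive (internal state); dirt_level() is
# exteroceptive (environment). All robot methods are illustrative placeholders.

import time

LOW_BATTERY = 0.15      # fraction of charge below which the robot self-docks
DIRTY = 0.5             # dirt-sensor reading above which the robot lingers

def control_step(robot):
    if robot.battery_level() < LOW_BATTERY:
        robot.navigate_to_dock()          # self-maintenance: go recharge
    elif robot.dirt_level() > DIRTY:
        robot.spot_clean()                # exteroception: stay in the dirty area
    else:
        robot.continue_coverage_pattern() # default task behavior

def run(robot, period_s=0.1):
    while True:
        control_step(robot)
        time.sleep(period_s)
</syntaxhighlight>

The essential pattern, common to most autonomous robots, is this repeated cycle of reading sensors, selecting a behavior, and acting.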
=== Task performance ===
The next step in autonomous behavior is to actually perform a physical task. A new area showing commercial promise is domestic robots, with a flood of small vacuuming robots beginning with [[iRobot]] and [[Electrolux]] in 2002. While the level of intelligence is not high in these systems, they navigate over wide areas and maneuver in tight spaces around homes using contact and non-contact sensors. Both of these robots use proprietary algorithms to increase coverage over simple random bounce.

The next level of autonomous task performance requires a robot to perform conditional tasks. For instance, security robots can be programmed to detect intruders and respond in a particular way depending upon where the intruder is. [[Amazon (company)|Amazon]], for example, launched its Astro robot for home monitoring, security and eldercare in September 2021.<ref>{{cite web |last1=Heater |first1=Brian |title=Why Amazon built a home robot |url=https://techcrunch.com/2021/09/28/why-amazon-built-a-home-robot/ |website=Tech Crunch |date=28 September 2021 |access-date=29 September 2021}}</ref>

=== Autonomous navigation ===
==== Indoor navigation ====
Associating behaviors with a place ([[robot localization|localization]]) requires a robot to know where it is and to be able to navigate point-to-point. Such navigation began with wire-guidance in the 1970s and progressed in the early 2000s to beacon-based [[triangulation]]. Current commercial robots autonomously navigate based on sensing natural features. The first commercial robots to achieve this were Pyxus' HelpMate hospital robot and the CyberMotion guard robot, both designed by robotics pioneers in the 1980s. These robots originally used manually created [[Computer-aided design|CAD]] floor plans, sonar sensing and wall-following variations to navigate buildings. The next generation, such as MobileRobots' [[PatrolBot]] and autonomous wheelchair,<ref>{{cite web |url=https://www.researchgate.net/publication/236882346 |title=Autonomous Wheelchair: Concept and Exploration |first1=Rafael |last1=Berkvens |first2=Wouter |last2=Rymenants |first3=Maarten |last3=Weyn |first4=Simon |last4=Sleutel |first5=Willy |last5=Loockx |work=AMBIENT 2012 : The Second International Conference on Ambient Computing, Applications, Services and Technologies |via=[[ResearchGate]]}}</ref> both introduced in 2004, can create their own laser-based [[robotic mapping|maps of a building]] and navigate open areas as well as corridors. Their control system changes its path on the fly if something blocks the way.

At first, autonomous navigation was based on planar sensors, such as laser range-finders, that can only sense at one level. The most advanced systems now fuse information from various sensors for both localization (position) and navigation. Systems such as Motivity can rely on different sensors in different areas, depending upon which provides the most reliable data at the time, and can re-map a building autonomously. Rather than climb stairs, which requires highly specialized hardware, most indoor robots navigate handicapped-accessible areas, controlling elevators and electronic doors.<ref>[http://www.ccsrobotics.com/speciminder.htm "Speci-Minder; see elevator and door access"] {{webarchive |url=https://web.archive.org/web/20080102053209/http://www.ccsrobotics.com/speciminder.htm |date=January 2, 2008 }}</ref> With such electronic access-control interfaces, robots can now freely navigate indoors. Autonomously climbing stairs and manually opening doors remain active areas of research.
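A highly simplified sketch of the map-and-replan behavior described above is shown below, using a coarse occupancy grid and breadth-first search. It is illustrative only and assumes a placeholder robot interface; the named commercial systems use proprietary and far more sophisticated methods.

<syntaxhighlight lang="python">
# Illustrative replan-on-blockage loop over a 2-D occupancy grid.
# grid[y][x] is True where an obstacle has been sensed. Real systems build
# probabilistic maps from laser or sonar data, but the control idea is the
# same: follow the plan, and replan when a step turns out to be blocked.

from collections import deque

def plan(grid, start, goal):
    """Breadth-first search returning a list of cells from start to goal."""
    queue, parents = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if (0 <= ny < len(grid) and 0 <= nx < len(grid[0])
                    and not grid[ny][nx] and nxt not in parents):
                parents[nxt] = cell
                queue.append(nxt)
    return None  # no known path to the goal

def navigate(robot, grid, goal):
    path = plan(grid, robot.position(), goal)
    while path and robot.position() != goal:
        nxt = path[1]
        if robot.senses_obstacle_at(nxt):              # something blocks the way
            grid[nxt[1]][nxt[0]] = True                # update the map ...
            path = plan(grid, robot.position(), goal)  # ... and replan on the fly
        else:
            robot.move_to(nxt)
            path = path[1:]
</syntaxhighlight>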
As these indoor techniques continue to develop, vacuuming robots will gain the ability to clean a specific user-specified room or a whole floor. Security robots will be able to cooperatively surround intruders and cut off exits. These advances also bring concomitant protections: robots' internal maps typically permit "forbidden areas" to be defined to prevent robots from autonomously entering certain regions.

==== Outdoor navigation ====
Outdoor autonomy is most easily achieved in the air, since obstacles are rare. [[Cruise missile]]s are dangerous, highly autonomous robots. Pilotless drone aircraft are increasingly used for reconnaissance. Some of these [[unmanned aerial vehicle]]s (UAVs) are capable of flying their entire mission without any human interaction at all, except possibly for the landing, where a person intervenes using radio remote control; some drones, however, are capable of safe, automatic landings. [[SpaceX]] operates a number of [[autonomous spaceport drone ship]]s, used to safely land and recover [[Falcon 9]] rockets at sea.<ref name=nsf20141117> {{cite news |last1=Bergin|first1=Chris |title=Pad 39A – SpaceX laying the groundwork for Falcon Heavy debut |url=http://www.nasaspaceflight.com/2014/11/pad-39a-spacex-groundwork-falcon-heavy-debut/ |access-date=2014-11-17 |work=NASA Spaceflight |date=2014-11-18 }}</ref> A few countries, such as India, have started working on [https://www.skyeair.tech/ robotic deliveries] of food and other articles by [[Unmanned aerial vehicle|drone]].

Outdoor autonomy is the most difficult for ground vehicles, due to:
* Three-dimensional terrain
* Great disparities in surface density
* Weather exigencies
* Instability of the sensed environment

=== Open problems in autonomous robotics ===
{{Expand section|date=July 2008}}
Several open problems in autonomous robotics are special to the field rather than being part of the general pursuit of AI. According to George A. Bekey's ''Autonomous Robots: From Biological Inspiration to Implementation and Control'', these include ensuring that the robot functions correctly and avoids obstacles without human intervention. Reinforcement learning has been used to control and plan the navigation of autonomous robots, particularly when a group of them operates in collaboration with each other.<ref name="MBK">{{Cite journal | title=Detection of Static and Mobile Targets by an Autonomous Agent with Deep Q-Learning Abilities | journal=Entropy | year=2022 | volume=24 | issue=8 | page=1168 | doi=10.3390/e24081168 | pmid=36010832 | pmc=9407070 | doi-access=free | last1=Matzliach | first1=Barouch | last2=Ben-Gal | first2=Irad | last3=Kagan | first3=Evgeny | bibcode=2022Entrp..24.1168M }}</ref>
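The cited work applies deep Q-learning with multiple cooperating agents; as a much simpler illustration of the underlying idea, the following sketch shows a tabular Q-learning loop for single-robot grid navigation, assuming a hypothetical environment object that provides <code>reset()</code> and <code>step()</code>.

<syntaxhighlight lang="python">
# Toy tabular Q-learning for single-robot grid navigation.
# The environment (reset/step, rewards) is a stand-in; the cited research
# uses deep Q-networks and multiple cooperating agents, which is far richer.

import random
from collections import defaultdict

ACTIONS = ["up", "down", "left", "right"]

def train(env, episodes=500, alpha=0.1, gamma=0.95, epsilon=0.1):
    q = defaultdict(float)                       # Q[(state, action)] -> value
    for _ in range(episodes):
        state = env.reset()
        done = False
        while not done:
            if random.random() < epsilon:        # explore
                action = random.choice(ACTIONS)
            else:                                # exploit best known action
                action = max(ACTIONS, key=lambda a: q[(state, a)])
            next_state, reward, done = env.step(action)
            best_next = max(q[(next_state, a)] for a in ACTIONS)
            # Standard Q-learning temporal-difference update
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
            state = next_state
    return q
</syntaxhighlight>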
;Energy autonomy and foraging
Researchers concerned with creating true [[artificial life]] are concerned not only with intelligent control, but also with the capacity of the robot to find its own resources through [[foraging]] (looking for food, which includes both energy and spare parts). This is related to '''autonomous foraging''', a concern within the sciences of [[behavioral ecology]], [[social anthropology]], and [[human behavioral ecology]], as well as [[robot]]ics, [[artificial intelligence]], and [[artificial life]].<ref>{{cite book|author = Kagan E., Ben-Gal, I., (2015)|format = PDF|title = Search and Foraging: Individual Motion and Swarm Dynamics (268 Pages) | date=23 June 2015 |url = https://www.amazon.com/Search-Foraging-Individual-Motion-Dynamics-ebook/dp/B010ACWAXC?ie=UTF8&*Version*=1&*entries*=0|publisher = CRC Press, Taylor and Francis }}</ref>

==Societal impact and issues==
As autonomous robots have grown in capability, there has been increasing societal awareness and news coverage of the latest advances, as well as of the philosophical issues, economic effects, and societal impacts that arise from the roles and activities of autonomous robots. Elon Musk, a prominent business executive and billionaire, has warned for years of the possible hazards and pitfalls of autonomous robots; however, his own company is among the most prominent companies trying to devise new, advanced technologies in this area.<ref>[https://www.cnbc.com/2021/08/24/elon-musk-warned-of-ai-apocalypsenow-hes-building-a-tesla-robot.html Elon Musk warned of a 'Terminator'-like AI apocalypse – now he's building a Tesla robot], Tue, Aug 24 2021, Brandon Gomez, cnbc.com</ref>

In 2021, a United Nations group of government experts, known as the ''Convention on Certain Conventional Weapons – Group of Governmental Experts on Lethal Autonomous Weapons Systems'', held a conference to highlight the ethical concerns that arise from increasingly advanced technology that allows autonomous robots to wield weapons and play a military role.<ref>[https://undocs.org/ccw/gge.1/2021/1 Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects], July 14, 2021, UN Official website at undocs.org.</ref>

==Technical development==
===Early robots===
The first autonomous robots were known as [[Elmer and Elsie (robots)|Elmer and Elsie]], constructed in the late 1940s by [[William Grey Walter|W. Grey Walter]]. They were the first [[robots]] programmed to "think" the way biological brains do and were meant to have [[free will]].<ref name=IngalisArkel>Ingalis-Arkell, Esther [https://io9.gizmodo.com/5890771/the-very-first-robot-brains-were-made-of-old-alarm-clocks "The Very First Robot Brains Were Made of Old Alarm Clocks"] {{Webarchive|url=https://web.archive.org/web/20180908015719/https://io9.gizmodo.com/5890771/the-very-first-robot-brains-were-made-of-old-alarm-clocks |date=2018-09-08 }}, 7 March 2012.</ref> Elmer and Elsie were often labeled as tortoises because of how they were shaped and the manner in which they moved.
They were capable of [[phototaxis]], the movement that occurs in response to light stimulus.<ref name=NormaJeremy>Norman, Jeremy, [http://www.historyofinformation.com/expanded.php?id=854 "The First Electronic Autonomous Robots: the Origin of Social Robotics (1948 – 1949)"], Jeremy Norman & Co., Inc., 2004–2018.</ref>

===Space probes===
The Mars rovers [[MER-A]] and [[MER-B]] (now known as [[Spirit rover|''Spirit'' rover]] and [[Opportunity rover|''Opportunity'' rover]]) found the position of the Sun and navigated their own routes to destinations, on the fly, by:
* Mapping the surface with 3D vision
* Computing safe and unsafe areas on the surface within that field of vision
* Computing optimal paths across the safe area towards the desired destination
* Driving along the calculated route
* Repeating this cycle until either the destination is reached, or there is no known path to the destination
A schematic version of this cycle is sketched at the end of this section.

The planned [[ESA]] Rover, [[Rosalind Franklin (rover)|''Rosalind Franklin'' rover]], is capable of vision-based relative and absolute localisation, allowing it to autonomously navigate safe and efficient trajectories to targets by:
* [[3D reconstruction|Reconstructing 3D models]] of the terrain surrounding the Rover using a pair of stereo cameras
* Determining safe and unsafe areas of the terrain and the general "difficulty" for the Rover to navigate the terrain
* Computing efficient paths across the safe area towards the desired destination
* Driving the Rover along the planned path
* Building up a navigation map of all previous navigation data

During the final NASA Sample Return Robot Centennial Challenge in 2016, a rover named Cataglyphis successfully demonstrated fully autonomous navigation, decision-making, and sample detection, retrieval, and return capabilities.<ref>{{Cite web|url=https://www.nasa.gov/directorates/spacetech/centennial_challenges/feature/2016_sample_return_robot_challenge_award.html|title=NASA Awards $750K in Sample Return Robot Challenge|last=Hall|first=Loura|date=2016-09-08|access-date=2016-09-17}}</ref> The rover relied on a fusion of measurements from [[inertial sensor]]s, wheel encoders, Lidar, and camera for navigation and mapping, instead of using GPS or magnetometers. During the 2-hour challenge, Cataglyphis traversed over 2.6 km and returned five different samples to its starting position.
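The rover drive cycle enumerated above can be summarized schematically as follows. The helper methods stand in for large perception, planning and control subsystems; the sketch is illustrative only and is not actual rover flight software.

<syntaxhighlight lang="python">
# Schematic version of the drive cycle described for the MER rovers.
# Each helper is a placeholder for a major subsystem (stereo mapping,
# traversability analysis, path planning, motion control); none of this
# is real flight code.

def autonomous_drive(rover, destination):
    while rover.position() != destination:
        terrain = rover.build_3d_map()              # map the surface with 3D vision
        safe = rover.classify_traversable(terrain)  # compute safe vs. unsafe areas
        path = rover.plan_path(safe, destination)   # optimal path across the safe area
        if path is None:
            return False           # no known path to the destination
        rover.drive_segment(path)  # drive part of the route, then re-sense
    return True                    # destination reached
</syntaxhighlight>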
===General-use autonomous robots===
[[Image:smSeekurMDARS.jpg|thumb|The Seekur and MDARS robots demonstrate their autonomous navigation and security capabilities at an airbase.]]
[[Image: Sophia at the AI for Good Global Summit 2018 (27254369347) (cropped).jpg|thumb|Sophia, a robot known for human-like appearance and interactions]]
[[File:Seyiton-AMR.jpg|thumb|AMR transfer cart for in-factory transfer needs]]
The Seekur robot was the first commercially available robot to demonstrate MDARS-like capabilities for general use by airports, utility plants, corrections facilities and [[Homeland Security]].<ref>[https://www.foxnews.com/story/weapons-makers-unveil-new-era-of-counter-terror-equipment "Weapons Makers Unveil New Era of Counter-Terror Equipment"], Fox News</ref> The [[DARPA Grand Challenge]] and [[DARPA Urban Challenge]] have encouraged development of even more autonomous capabilities for ground vehicles, while this has been the demonstrated goal for aerial robots since 1990 as part of the AUVSI [[International Aerial Robotics Competition]]. AMR transfer carts developed by Seyiton are used to transfer loads of up to 1500 kilograms inside factories.<ref>[https://seyiton.com/en/autonomous-mobile-robot-amr/ "Autonomous Mobile Robots (AMR)"], Seyiton</ref>

Between 2013 and 2017, [[TotalEnergies]] held the [[ARGOS Challenge]] to develop the first autonomous robot for oil and gas production sites. The robots had to face adverse outdoor conditions such as rain, wind and extreme temperatures.<ref>{{cite web|title=Enhanced Safety Thanks to the ARGOS Challenge|url=http://www.total.com/en/media/news/news/enhanced-safety-thanks-argos-challenge?folder=7692|website=Total Website|access-date=13 May 2017|archive-date=16 January 2018|archive-url=https://web.archive.org/web/20180116141041/https://www.total.com/en/media/news/news/enhanced-safety-thanks-argos-challenge?folder=7692|url-status=dead}}</ref>

Some significant current robots include:
* [[Sophia (robot)|Sophia]] is an autonomous robot<ref name="wired">{{cite magazine|url=https://www.wired.com/story/photographing-a-robot/|title=Photographing a robot isn't just point and shoot|magazine=Wired|date=March 29, 2018|access-date=October 10, 2018|archive-date=December 25, 2018|archive-url=https://web.archive.org/web/20181225204516/https://www.wired.com/story/photographing-a-robot/|url-status=live}}</ref><ref>{{cite web|url=http://www.hansonrobotics.com/robot/sophia/|title=Hanson Robotics Sophia|work=Hanson Robotics|access-date=October 26, 2017|archive-date=November 19, 2017|archive-url=https://web.archive.org/web/20171119013425/http://www.hansonrobotics.com/robot/sophia/|url-status=live}}</ref> known for a more human-like appearance and behavior than previous robotic variants. As of 2018, Sophia's architecture includes scripting software, a chat system, and [[OpenCog]], an AI system designed for general reasoning.<ref>{{cite news |title=The complicated truth about Sophia the robot – an almost human robot or a PR stunt |url=https://www.cnbc.com/2018/06/05/hanson-robotics-sophia-the-robot-pr-stunt-artificial-intelligence.html |access-date=17 May 2020 |work=CNBC |date=5 June 2018 |archive-date=May 12, 2020 |archive-url=https://web.archive.org/web/20200512030753/https://www.cnbc.com/2018/06/05/hanson-robotics-sophia-the-robot-pr-stunt-artificial-intelligence.html |url-status=live }}</ref> Sophia imitates human gestures and facial expressions and is able to answer certain questions and to make simple conversation on predefined topics (e.g. on the weather).<ref>{{cite web|url=http://www.hansonrobotics.com/news/|title=Hanson Robotics in the news|work=Hanson Robotics|access-date=October 26, 2017|archive-date=November 12, 2017|archive-url=https://web.archive.org/web/20171112111735/http://www.hansonrobotics.com/news|url-status=live}}</ref> The AI program analyses conversations and extracts data that allows it to improve responses in the future.<ref name="cbs">{{cite news|url=https://www.cbsnews.com/news/60-minutes-charlie-rose-interviews-a-robot-sophia/|title=Charlie Rose interviews ...
a robot?|work=CBS 60 Minutes|date=June 25, 2017|access-date=October 28, 2017|archive-date=October 29, 2017|archive-url=https://web.archive.org/web/20171029013415/https://www.cbsnews.com/news/60-minutes-charlie-rose-interviews-a-robot-sophia/|url-status=live}}</ref> * Nine other robot humanoid "siblings" who were also created by [[Hanson Robotics]].<ref name="auto2">{{Cite news|url=http://www.businessinsider.com/sophia-robot-hanson-robotics-other-humanoids-2017-11|title=The first-ever robot citizen has 7 humanoid 'siblings' β here's what they look like|work=Business Insider|access-date=January 4, 2018|archive-date=January 4, 2018|archive-url=https://web.archive.org/web/20180104030409/http://www.businessinsider.com/sophia-robot-hanson-robotics-other-humanoids-2017-11|url-status=live}}</ref> Fellow Hanson robots are Alice, [[Albert HUBO|Albert Einstein Hubo]], [[BINA48]], Han, Jules, Professor Einstein, Philip K. Dick Android, Zeno,<ref name="auto2"/> and Joey Chaos.<ref>{{Cite news|url=https://gizmodo.com/263573/joey-the-rocker-robot-more-conscious-than-some-humans|title=Joey the Rocker Robot, More Conscious Than Some Humans|last=White|first=Charlie|work=Gizmodo|access-date=January 4, 2018|archive-date=December 22, 2017|archive-url=https://web.archive.org/web/20171222052524/https://gizmodo.com/263573/joey-the-rocker-robot-more-conscious-than-some-humans|url-status=live}}</ref> Around 2019β20, Hanson released "Little Sophia" as a companion that could teach children how to code, including support for Python, Blockly, and Raspberry Pi.<ref>{{Cite news|title=Hanson Robotics debuts Little Sophia, a robot companion that teaches kids to code|url=https://venturebeat.com/2019/01/30/hanson-robotics-debuts-little-sophia-a-robot-companion-that-teaches-kids-how-to-code/|last=Wiggers|first=Kyle|date=January 30, 2019|access-date=April 2, 2020|work=[[VentureBeat]]|archive-date=August 9, 2020|archive-url=https://web.archive.org/web/20200809213506/https://venturebeat.com/2019/01/30/hanson-robotics-debuts-little-sophia-a-robot-companion-that-teaches-kids-how-to-code/|url-status=live}}</ref> ===Military autonomous robots=== [[Lethal autonomous weapon]]s (LAWs) are a type of autonomous robot [[military robot|military system]] that can independently search for and engage targets based on programmed constraints and descriptions.<ref name=":1">{{cite journal|last=Crootof|first=Rebecca|date=2015|title=The Killer Robots Are Here: Legal and Policy Implications|url=https://heinonline.org/HOL/Page?collection=journals&handle=hein.journals/cdozo36&id=1943|journal=Cardozo L. Rev. |volume=36 |pages=1837|via=heinonline.org}}</ref> LAWs are also known as lethal autonomous weapon systems (LAWS), autonomous weapon systems (AWS), robotic weapons, killer robots or slaughterbots.<ref>{{cite web |last=Johnson |first=Khari |title=Andrew Yang warns against 'slaughterbots' and urges global ban on autonomous weaponry |url=https://venturebeat.com/2020/01/31/andrew-yang-warns-against-slaughterbots-and-urges-global-ban-on-autonomous-weaponry/ |website=venturebeat.com |publisher=[[VentureBeat]] |date=31 January 2020 |access-date=31 January 2020}}</ref> LAWs may operate in the air, on land, on water, under water, or in space. The autonomy of current systems {{as of|2018|lc=y}} was restricted in the sense that a human gives the final command to attack β though there are exceptions with certain "defensive" systems. 
*UGV Interoperability Profile (UGV IOP), Robotics and Autonomous Systems β Ground IOP (RAS-G IOP), was originally a research program started by the [[United States Department of Defense|United States Department of Defense (DoD)]] to organize and maintain [[open architecture]] [[interoperability]] standards for [[Unmanned Ground Vehicles|Unmanned Ground Vehicles (UGV)]].<ref name="IOPv2">{{cite book|title=Robotics and Autonomous Systems β Ground (RAS-G) Interoperability Profile (IOP)|date=2016|publisher=US Army Project Manager, Force Projection (PM FP)|location=Warren, Michigan, USA|edition=Version 2.0|url=https://namcgroups.org/|ref=IOPv2}}</ref><ref name="aviation2012">{{cite news|title=U.S. Army Unveils Common UGV Standards|url=http://aviationweek.com/awin/us-army-unveils-common-ugv-standards|access-date=25 April 2017|work=Aviation Week Network|publisher=Penton|date=10 January 2012}}</ref><ref name="fnr2014">{{cite news|last1=Serbu|first1=Jared|title=Army turns to open architecture to plot its future in robotics|url=https://federalnewsradio.com/defense/2014/08/army-turns-to-open-architecture-to-plot-its-future-in-robotics/|access-date=28 April 2017|work=Federal News Radio|date=14 August 2014|ref=fnr2014}}</ref><ref name="robolliance">{{cite news |last1=Demaitre |first1=Eugene |title=Military Robots Use Interoperability Profile for Mobile Arms |work= |agency=Robotics Business Review |url=https://www.roboticsbusinessreview.com/supply-chain/military-robots-use-interoperability-profile-mobile-arms/ |archive-url=https://web.archive.org/web/20200814120445/https://www.roboticsbusinessreview.com/supply-chain/military-robots-use-interoperability-profile-mobile-arms/ |url-status=dead |archive-date=August 14, 2020 |access-date=14 July 2016 }}</ref> The IOP was initially created by U.S. Army Robotic Systems Joint Project Office (RS JPO):<ref>{{cite web|last1=Mazzara|first1=Mark|title=RS JPO Interoperability Profiles|url=http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA548099|publisher=U.S. Army RS JPO|access-date=20 March 2017|ref=iop2011|location=Warren, Michigan|date=2011}}{{dead link|date=June 2022|bot=medic}}{{cbignore|bot=medic}}</ref><ref name="iop2014">{{cite web|last1=Mazzara|first1=Mark|title=UGV Interoperability Profiles (IOPs) Update for GVSETS|url=http://ww2.esd.org/gvsets/pdf/ags/1500mazzara_skalny.pdf|publisher=U.S. Army PM FP|access-date=20 March 2017|location=Warren, Michigan|date=2014|ref=iop2014}}{{Dead link|date=October 2022 |bot=InternetArchiveBot |fix-attempted=yes }}</ref><ref name="rbr2016">{{cite news|last1=Demaitre|first1=Eugene|title=Military Robots Use Interoperability Profile for Mobile Arms|url=https://www.roboticsbusinessreview.com/security/military-robots-use-interoperability-profile-mobile-arms/|access-date=28 April 2017|work=Robotics Business Review|publisher=EH Publishing|date=14 July 2016|ref=rbr2016}}{{Dead link|date=January 2024 |bot=InternetArchiveBot |fix-attempted=yes }}</ref> * In October 2019, Textron and Howe & Howe unveiled their [[Ripsaw (vehicle)|Ripsaw]] M5 vehicle,<ref>[https://breakingdefense.com/2019/10/textron-rolls-out-ripsaw-robot-for-rcv-light-and-rcv-medium/ Textron Rolls Out Ripsaw Robot For RCV-Light β¦ And RCV-Medium]. ''Breaking Defense''. 14 October 2019.</ref> and on 9 January 2020, the U.S. Army awarded them a contract for the Robotic Combat Vehicle-Medium (RCV-M) program. 
Four Ripsaw M5 prototypes are to be delivered and used at the [[company (military unit)|company]] level in late 2021 to determine the feasibility of integrating unmanned vehicles into ground combat operations.<ref>[https://www.defensenews.com/land/2020/01/09/army-picks-winners-to-build-light-and-medium-robotic-combat-vehicles/ US Army picks winners to build light and medium robotic combat vehicles]. ''[[Defense News]]''. 9 January 2020.</ref><ref>[https://www.army.mil/article/231572/gvsc_ngcv_cft_announces_rcv_light_and_medium_award_selections GVSC, NGCV CFT announces RCV Light and Medium award selections]. ''Army.mil''. 10 January 2020.</ref><ref>[https://www.military.com/daily-news/2020/01/14/army-picks-2-firms-build-light-and-medium-robotic-combat-vehicles.html Army Picks 2 Firms to Build Light and Medium Robotic Combat Vehicles]. ''[[Military.com]]''. 14 January 2020.</ref> It can reach speeds of more than {{cvt|40|mph|abbr=on}}, has a combat weight of 10.5 tons and a payload capacity of {{cvt|8000|lb|abbr=on}}.<ref>[https://www.nationaldefensemagazine.org/articles/2020/4/10/army-setting-stage-for-new-unmanned-platforms Army Setting Stage for New Unmanned Platforms]. ''National Defense Magazine''. 10 April 2020.</ref> The RCV-M is armed with a [[Mk44 Bushmaster II|30 mm autocannon]] and a pair of [[anti-tank missile]]s. The standard armor package can withstand [[12.7×108mm]] rounds, with optional add-on armor increasing weight to up to 20 tons. If disabled, it will retain the ability to shoot, with its sensors and radio uplink prioritized to continue transmitting as its primary function.<ref>[https://breakingdefense.com/2020/11/meet-the-armys-future-family-of-robot-tanks-rcv/ Meet The Army's Future Family Of Robot Tanks: RCV]. ''Breaking Defense''. 9 November 2020.</ref>
* Crusher is a {{convert|13200|lb|kg|adj=on}}<ref name="brochure">{{cite press release|title=UPI: UGCV PerceptOR Integration|publisher=Carnegie Mellon University|url=http://www.rec.ri.cmu.edu/projects/crusher/Crusher_Brochure.pdf|access-date=18 November 2010|archive-url=https://web.archive.org/web/20131216022023/http://www.rec.ri.cmu.edu/projects/crusher/Crusher_Brochure.pdf|archive-date=16 December 2013|url-status=dead}}</ref> [[wikt:autonomous|autonomous]] off-road [[Unmanned Ground Combat Vehicle]] developed by researchers at [[Carnegie Mellon University]]'s [[National Robotics Engineering Center]] for [[DARPA]].<ref name="cmu_nrec">{{cite press release|title=Carnegie Mellon's National Robotics Engineering Center Unveils Futuristic Unmanned Ground Combat Vehicles|publisher=Carnegie Mellon University|date=April 28, 2006|url=http://www.rec.ri.cmu.edu/projects/crusher/Crusher_Press_Release.pdf|access-date=18 November 2010|archive-url=https://web.archive.org/web/20100922225949/http://www.rec.ri.cmu.edu/projects/crusher/Crusher_Press_Release.pdf|archive-date=22 September 2010|url-status=dead}}</ref> It is a follow-up to the previous Spinner vehicle.<ref name="DARPA-press">{{cite press release|title=Crusher Unmanned Ground Combat Vehicle Unveiled|publisher=Defense Advanced Research Projects Agency|date=April 28, 2006|url=http://www.rec.ri.cmu.edu/projects/crusher/Crusher_Press_Release_DARPA.pdf|access-date=18 November 2010|archive-url=https://web.archive.org/web/20110112072447/http://www.rec.ri.cmu.edu/projects/crusher/Crusher_Press_Release_DARPA.pdf|archive-date=12 January 2011|url-status=dead}}</ref> DARPA's technical name for the Crusher is ''Unmanned Ground Combat Vehicle and Perceptor Integration System'',<ref name="sharkey">{{cite journal|last=Sharkey|first=Noel|title=Grounds for Discrimination: Autonomous Robot Weapons|journal=RUSI: Challenges of Autonomous Weapons|pages=87|url=http://www.rusi.org/downloads/assets/23sharkey.pdf|access-date=18 November 2010|url-status=dead|archive-url=https://web.archive.org/web/20110928105846/http://www.rusi.org/downloads/assets/23sharkey.pdf|archive-date=28 September 2011}}</ref> and the whole project is known by the acronym UPI, which stands for ''Unmanned Ground Combat Vehicle PerceptOR Integration''.<ref name=cmu_nrec />
* [[CATS Warrior]] will be an autonomous wingman drone capable of taking off from and landing on both land and an [[aircraft carrier]] at sea. It will team up with existing fighter platforms of the [[Indian Air Force|IAF]], such as the [[HAL Tejas|Tejas]], [[sukhoi Su-30MKI|Su-30 MKI]] and [[SEPECAT Jaguar|Jaguar]], which will act as its mothership.<ref name="indiatoday">{{cite web|url=https://www.indiatoday.in/india/story/india-gears-up-for-unmanned-warfare-helicopter-drones-cats-warrior-ration-delivery-ladakh-troops-1766009-2021-02-04|title=Strikes from 700km away to drones replacing mules for ration at 15,000ft, India gears up for unmanned warfare – India News|website=indiatoday.in|date=4 February 2021 |access-date=22 February 2021}}</ref>
* The Warrior is primarily envisioned for Indian Air Force use, and a similar, smaller version will be designed for the [[Indian Navy]]. It would be controlled by the mothership and accomplish tasks such as scouting, absorbing enemy fire, attacking targets if necessary with weapons on its internal and external pylons, or sacrificing itself by crashing into a target.
* The SGR-A1 is a type of autonomous [[sentry gun]] that was jointly developed by [[Samsung Techwin]] (now [[Hanwha Aerospace]]) and [[Korea University]] to assist South Korean troops in the [[Korean Demilitarized Zone]]. It is widely considered the first unit of its kind to have an integrated system that includes surveillance, tracking, firing, and voice recognition.<ref name=":1e">{{cite web|url=https://spectrum.ieee.org/a-robotic-sentry-for-koreas-demilitarized-zone|title=A Robotic Sentry For Korea's Demilitarized Zone|last=Kumagai|first=Jean|date=March 1, 2007|website=|publisher=IEEE Spectrum|access-date=}}</ref> While units of the SGR-A1 have reportedly been deployed, their number is unknown due to the project being "highly classified".<ref>{{cite web|url=http://www.stripes.com/news/pacific/korea/machine-gun-toting-robots-deployed-on-dmz-1.110809|title=Machine Gun Toting Robots Deployed On DMZ|last=Rabiroff|first=Jon|date=July 12, 2010|website=|publisher=Stars and Stripes|access-date=|archive-date=April 6, 2018|archive-url=https://web.archive.org/web/20180406040642/https://www.stripes.com/news/pacific/korea/machine-gun-toting-robots-deployed-on-dmz-1.110809|url-status=dead}}</ref>

==Types of robots==
===Humanoid===
[[Tesla Robot]] and [[NVIDIA GR00T]] are humanoid robots. Humanoids are machines designed to mimic the human form in appearance and behavior; they typically have a head, torso, arms, and legs.

===Delivery robot===
{{Main|Delivery robot}}
{{See also|Delivery drone}}
[[File:Food delivery bot at Yangfang Shengli Original Restaurant (20200111163318).jpg|thumb|A food delivery robot]]
A delivery robot is an autonomous robot used for delivering goods.

=== Charging robot ===
An automatic charging robot, unveiled on July 27, 2022, is an arm-shaped robot for charging electric vehicles. It has been running in a pilot operation at Hyundai Motor Group's headquarters since 2021 and uses a vision AI system based on deep-learning technology. When an electric vehicle is parked in front of the charger, the robot arm recognizes the vehicle's charging port, derives its coordinates, automatically inserts the connector, and starts fast charging. The arm has a vertical multi-joint structure so that it can reach charging ports located differently on each vehicle, and it is waterproof and dustproof.<ref>{{Cite journal |first= |date=August 2, 2022 |title=Robotics Lifestyle Innovation Brought by Robots |url=https://tech.hyundaimotorgroup.com/convergence/robotics/ |journal=HyundaiMotorGroup Tech |access-date=August 3, 2022 |archive-date=August 3, 2022 |archive-url=https://web.archive.org/web/20220803095844/https://tech.hyundaimotorgroup.com/convergence/robotics/ |url-status=dead }}</ref>

===Construction robots===
Construction robots are used directly on job sites and perform work such as building, material handling, earthmoving, and surveillance.

===Research and education mobile robots===
Research and education mobile robots are mainly used during the prototyping phase of building full-scale robots. They are scaled-down versions of bigger robots, with the same types of sensors, [[robot kinematics|kinematics]] and software stack (e.g. [[Robot Operating System|ROS]]). They are often extendable and provide a convenient programming interface and development tools. Besides full-scale robot prototyping, they are also used for education, especially at the university level, where a growing number of laboratory courses on programming autonomous vehicles are being introduced.
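As an illustration of the kind of software stack mentioned above, the following is a minimal ROS 2 node written in Python with the standard <code>rclpy</code> client library, publishing velocity commands on the conventional <code>cmd_vel</code> topic. It is a generic example and is not tied to any particular research platform.

<syntaxhighlight lang="python">
# Minimal ROS 2 node: publish a constant forward velocity at 10 Hz.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

class SimpleDriver(Node):
    def __init__(self):
        super().__init__('simple_driver')
        # Publisher on the conventional velocity-command topic
        self.pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.timer = self.create_timer(0.1, self.tick)  # 10 Hz

    def tick(self):
        msg = Twist()
        msg.linear.x = 0.2   # forward speed in m/s
        msg.angular.z = 0.0  # no rotation
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = SimpleDriver()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == '__main__':
    main()
</syntaxhighlight>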
==Legislation==
In March 2016, a bill was introduced in Washington, D.C., allowing pilot ground robotic deliveries.<ref>{{cite web|title=B21-0673 – Personal Delivery Device Act of 2016|url=http://lims.dccouncil.us/Legislation/B21-0673}}</ref> The program was to take place from September 15 through the end of December 2017. The robots were limited to a weight of 50 pounds unloaded and a maximum speed of 10 miles per hour. If a robot stopped moving because of a malfunction, the company was required to remove it from the streets within 24 hours. Only five robots were allowed to be tested per company at a time.<ref>{{cite web|url=https://www.washingtonpost.com/news/the-switch/wp/2016/06/24/its-official-drone-delivery-is-coming-to-d-c-in-september/|title=It's official: Drone delivery is coming to D.C. in September|first=Brian|last=Fung|date=24 June 2016|via=www.washingtonpost.com}}</ref> A 2017 version of the Personal Delivery Device Act bill was under review as of March 2017.<ref>{{cite web|title=B22-0019 – Personal Delivery Device Act of 2017|url=http://lims.dccouncil.us/Legislation/B22-0019}}</ref>

In February 2017, a bill was passed in the US state of [[Virginia]] via the House bill, HB2016,<ref>{{cite web| url = https://lis.virginia.gov/cgi-bin/legp604.exe?ses=171&typ=bil&val=HB2016| title = HB 2016 Electric personal delivery devices; operation on sidewalks and shared-use paths.}}</ref> and the Senate bill, SB1207,<ref>{{cite web| url = https://lis.virginia.gov/cgi-bin/legp604.exe?ses=171&typ=bil&val=SB1207| title = SB 1207 Electric personal delivery devices; operation on sidewalks and shared-use paths.}}</ref> allowing autonomous delivery robots to travel on sidewalks and use crosswalks statewide beginning on July 1, 2017. The robots are limited to a maximum speed of 10 mph and a maximum weight of 50 pounds.<ref>{{cite web|url=https://www.recode.net/2017/3/1/14782518/virginia-robot-law-first-state-delivery-starship|title=Virginia is the first state to pass a law allowing robots to deliver straight to your door|date=March 2017}}</ref> In the states of Idaho and Florida there have also been talks about passing similar legislation.<ref>{{cite web|url=http://www.ktvb.com/news/local/capitol-watch/bill-allowing-robots-to-make-deliveries-heads-to-idaho-house/416300359|title=Could delivery robots be on their way to Idaho?|access-date=2017-03-02|archive-date=2017-03-03|archive-url=https://web.archive.org/web/20170303124238/http://www.ktvb.com/news/local/capitol-watch/bill-allowing-robots-to-make-deliveries-heads-to-idaho-house/416300359|url-status=dead}}</ref><ref>[http://www.bizjournals.com/tampabay/news/2017/01/25/florida-senator-proposes-rules-for-tiny-personal.html Florida senator proposes rules for tiny personal delivery robots] January 25, 2017</ref>

It has been discussed{{by whom|date=April 2019}} that robots with similar characteristics to invalid carriages (e.g. 10 mph maximum, limited battery life) might be a workaround for certain classes of applications.
If the robot were sufficiently intelligent and able to recharge itself using the existing electric vehicle (EV) charging infrastructure, it would need only minimal supervision, and a single arm with low dexterity might be enough to enable this function if its visual systems had enough resolution.{{citation needed|date=April 2019}}

In November 2017, the San Francisco Board of Supervisors announced that companies would need to get a city permit in order to test these robots.<ref>{{Cite magazine|url=https://www.wired.com/story/san-francisco-just-put-the-brakes-on-delivery-robots/|title=San Francisco Just Put the Brakes on Delivery Robots|last=Simon|first=Matt|date=6 December 2017|magazine=[[Wired (magazine)|Wired]]|access-date=6 December 2017}}</ref> In addition, the Board banned sidewalk delivery robots from making non-research deliveries.<ref>{{Cite news|url=https://sf.curbed.com/2017/12/6/16743326/san-francisco-delivery-robot-ban|title=San Francisco bans robots from most sidewalks|last=Brinklow|first=Adam|date=6 December 2017|work=[[Curbed]]|access-date=6 December 2017}}</ref>

== See also ==
===Scientific concepts===
* [[Artificial intelligence]]
* [[Cognitive robotics]]
* [[Developmental robotics]]
* [[Evolutionary robotics]]
* [[Simultaneous localization and mapping]]
* [[Teleoperation]]
* [[Von Neumann architecture|von Neumann machine]]
* [[Wake-up robot problem]]
* [[William Grey Walter]]

===Types of robots===
* [[Autonomous car]]
* [[Autonomous research robot]]
* [[Autonomous spaceport drone ship]]
* [[Domestic robot]]
* [[Humanoid robot]]

===Specific robot models===
* [[AIBO]]
* [[Amazon Scout]]
* [[Microbotics]]
* [[PatrolBot]]
* [[RoboBee]]
* [[Robomow]]

===Others===
* [[Remote-control vehicle]]
* [[Robot control]]

==References==
{{reflist}}

==External links==
* {{Commons category-inline|Autonomous robots}}

{{Mobile robots}}
{{Authority control}}

[[Category:Robots|-]]
[[Category:Uncrewed vehicles]]
[[Category:Self-replication]]