=== Tactile feedback ===
[[File:Finger sensor.webp|thumb|upright=1.15|Rubber artificial skin layer with a flexible structure for the shape estimation of micro-undulation surfaces]]
[[File:Silicon Sensor.webp|thumb|upright=1.15|A silicone mold with a camera inside and many point markers embedded in it. When the sensor is pressed against a surface, the silicone deforms and the point markers shift. A computer can use this data to determine exactly how the mold is pressed against the surface, which can be used to calibrate robotic hands and ensure they grasp objects effectively.]]
Materials such as rubber and silicone are being used to create sensors for applications such as detecting micro-undulations and calibrating robotic hands. Rubber can be used to create a mold that fits over a finger and contains multiple strain gauges. The finger mold and sensors are then placed on top of a small sheet of rubber containing an array of rubber pins. A user wearing the finger mold traces a surface, and a computer reads the strain-gauge data to measure whether one or more of the pins is being pushed upward. A pin pushed upward is recognized as an imperfection in the surface. This sort of technology is useful for obtaining accurate data on imperfections across a very large surface.<ref name=":0">{{Cite journal|last1=Ando|first1=Mitsuhito|last2=Takei|first2=Toshinobu|last3=Mochiyama|first3=Hiromi|date=2020-03-03|title=Rubber artificial skin layer with flexible structure for shape estimation of micro-undulation surfaces|journal=ROBOMECH Journal|volume=7|issue=1|pages=11|doi=10.1186/s40648-020-00159-0|issn=2197-4225|doi-access=free}}</ref>

Another variation of this finger-mold sensor is a sensor that contains a camera suspended in silicone. The silicone forms a dome around the camera, and point markers spaced at equal intervals are embedded in the silicone. Such sensors can be mounted on devices such as robotic hands to provide the computer with highly accurate tactile data.<ref name=":1">{{Cite journal|last1=Choi|first1=Seung-hyun|last2=Tahara|first2=Kenji|date=2020-03-12|title=Dexterous object manipulation by a multi-fingered robotic hand with visual-tactile fingertip sensors|journal=ROBOMECH Journal|volume=7|issue=1|pages=14|doi=10.1186/s40648-020-00162-5|issn=2197-4225|doi-access=free}}</ref> Minimal illustrations of both sensing approaches are sketched after the list below.

Other application areas include:
* Support of [[visual effects]] creation for cinema and broadcast, ''e.g.'', [[camera tracking]] (match moving).
* [[Surveillance]].
* [[Driver drowsiness detection]]<ref>{{Cite book |last=Garg |first=Hitendra |title=2020 International Conference on Power Electronics & IoT Applications in Renewable Energy and its Control (PARC) |chapter=Drowsiness Detection of a Driver using Conventional Computer Vision Application |date=2020-02-29 |chapter-url=https://ieeexplore.ieee.org/document/9087013 |pages=50–53 |doi=10.1109/PARC49193.2020.236556 |isbn=978-1-7281-6575-2 |s2cid=218564267 |access-date=2022-11-06 |archive-date=2022-06-27 |archive-url=https://web.archive.org/web/20220627061928/https://ieeexplore.ieee.org/document/9087013/ |url-status=live }}</ref><ref>{{Cite book |last1=Hasan |first1=Fudail |last2=Kashevnik |first2=Alexey |title=2021 29th Conference of Open Innovations Association (FRUCT) |chapter=State-of-the-Art Analysis of Modern Drowsiness Detection Algorithms Based on Computer Vision |date=2021-05-14 |chapter-url=https://ieeexplore.ieee.org/document/9435480 |pages=141–149 |doi=10.23919/FRUCT52173.2021.9435480 |isbn=978-952-69244-5-8 |s2cid=235207036 |access-date=2022-11-06 |archive-date=2022-06-27 |archive-url=https://web.archive.org/web/20220627061552/https://ieeexplore.ieee.org/document/9435480/ |url-status=live }}</ref><ref>{{Cite journal |last1=Balasundaram |first1=A |last2=Ashokkumar |first2=S |last3=Kothandaraman |first3=D |last4=kora |first4=SeenaNaik |last5=Sudarshan |first5=E |last6=Harshaverdhan |first6=A |date=2020-12-01 |title=Computer vision based fatigue detection using facial parameters |journal=IOP Conference Series: Materials Science and Engineering |volume=981 |issue=2 |page=022005 |doi=10.1088/1757-899x/981/2/022005 |bibcode=2020MS&E..981b2005B |s2cid=230639179 |issn=1757-899X|doi-access=free }}</ref>
* Tracking and counting organisms in the biological sciences<ref name="BruijningVisser2018">{{cite journal|last1=Bruijning|first1=Marjolein|last2=Visser|first2=Marco D.|last3=Hallmann|first3=Caspar A.|last4=Jongejans|first4=Eelke|last5=Golding|first5=Nick|title=trackdem: Automated particle tracking to obtain population counts and size distributions from videos in r |journal=Methods in Ecology and Evolution|volume=9|issue=4|pages=965–973|year=2018|issn=2041-210X|doi=10.1111/2041-210X.12975|bibcode=2018MEcEv...9..965B |doi-access=free|hdl=2066/184075|hdl-access=free}}</ref>
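As a rough illustration of the strain-gauge approach described above, the following Python sketch flags raised pins from an array of strain readings. The read-out function, the 16-pin layout, and the threshold value are illustrative assumptions, not details from the cited work.

<syntaxhighlight lang="python">
# Minimal sketch (hypothetical hardware interface): flag surface imperfections
# from an array of strain-gauge readings. The threshold and pin count are
# assumptions for illustration only.
import numpy as np

THRESHOLD = 0.05   # assumed strain deviation that indicates a raised pin
baseline = None    # readings recorded while the sensor rests on a flat surface


def calibrate(readings: np.ndarray) -> None:
    """Store baseline strain values measured on a known-flat surface."""
    global baseline
    baseline = readings.copy()


def find_imperfections(readings: np.ndarray) -> np.ndarray:
    """Return the indices of pins pushed upward beyond the threshold."""
    deviation = readings - baseline
    return np.flatnonzero(deviation > THRESHOLD)


# Example with synthetic data: pin 7 is pushed upward by a bump in the surface.
calibrate(np.zeros(16))
sample = np.zeros(16)
sample[7] = 0.12
print(find_imperfections(sample))   # -> [7]
</syntaxhighlight>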
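The camera-based sensor can be illustrated in a similar spirit. The sketch below assumes grayscale frames in which the embedded markers appear as dark blobs; it detects marker centers with OpenCV's blob detector and compares them to a reference (undeformed) frame, so that the resulting displacement vectors indicate where and how strongly the dome is being pressed. This is a simplified stand-in for the marker tracking and calibration used in real visual-tactile sensors, not the method of the cited paper.

<syntaxhighlight lang="python">
# Minimal sketch, assuming 8-bit grayscale frames in which the point markers
# embedded in the silicone appear as dark blobs. Displacements between a
# reference frame and a deformed frame approximate the local deformation.
import cv2
import numpy as np

detector = cv2.SimpleBlobDetector_create()  # default parameters detect dark blobs


def marker_positions(frame: np.ndarray) -> np.ndarray:
    """Return an (N, 2) array of detected marker centers in pixel coordinates."""
    keypoints = detector.detect(frame)
    return np.array([kp.pt for kp in keypoints], dtype=np.float32)


def displacement_field(reference: np.ndarray, deformed: np.ndarray) -> np.ndarray:
    """Match each reference marker to its nearest marker in the deformed frame
    and return per-marker displacement vectors. A crude nearest-neighbour match;
    real systems track markers frame to frame."""
    ref = marker_positions(reference)
    cur = marker_positions(deformed)
    dists = np.linalg.norm(ref[:, None, :] - cur[None, :, :], axis=2)
    nearest = cur[dists.argmin(axis=1)]
    return nearest - ref
</syntaxhighlight>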