=== Imperfect technology in law enforcement ===
{{as of|2018|post=,}} it is still contested whether facial recognition technology works less accurately on people of color.<ref>{{Cite magazine|url=https://www.wired.com/story/photo-algorithms-id-white-men-fineblack-women-not-so-much/|title=Photo Algorithms ID White Men Fine – Black Women, Not So Much|magazine=WIRED|access-date=April 10, 2018|language=en-US}}</ref> One study by [[Joy Buolamwini]] (MIT Media Lab) and [[Timnit Gebru]] (Microsoft Research) found that the error rate for gender recognition for women of color within three commercial facial recognition systems ranged from 23.8% to 36%, whereas for lighter-skinned men it was between 0.0% and 1.6%. Overall accuracy rates for identifying men (91.9%) were higher than for women (79.4%), and none of the systems accommodated a non-binary understanding of gender.<ref>{{cite conference|url=http://proceedings.mlr.press/v81/buolamwini18a.html|title=Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification|author1=Joy Buolamwini|author2=Timnit Gebru|year=2018|book-title=Proceedings of Machine Learning Research|volume=81|pages=77–91|language=en|access-date=March 8, 2018|issn=1533-7928}}</ref> The study also showed that the datasets used to train commercial facial recognition models were unrepresentative of the broader population and skewed toward lighter-skinned males. However, another study showed that several commercial facial recognition systems sold to law enforcement offices around the country had a lower false non-match rate for black people than for white people.<ref>{{Cite web|url=https://nvlpubs.nist.gov/nistpubs/Legacy/IR/nistir7709.pdf|title=Report on the Evaluation of 2D Still-Image Face Recognition Algorithms|last1=Grother|first1=Patrick|last2=Quinn|first2=George|last3=Phillips|first3=P. Jonathon|date=August 24, 2011|website=National Institute of Standards and Technology}}</ref> Experts fear that face recognition systems may actually be hurting the citizens the police claim they are trying to protect.<ref>{{Cite web|url=https://www.theguardian.com/inequality/2017/aug/08/rise-of-the-racist-robots-how-ai-is-learning-all-our-worst-impulses|title=Rise of the racist robots – how AI is learning all our worst impulses|last=Buranyi|first=Stephen|date=August 8, 2017|website=The Guardian|language=en|access-date=April 10, 2018}}</ref> Face recognition is considered an imperfect biometric; Georgetown University researcher Clare Garvie concluded that "there's no consensus in the scientific community that it provides a positive identification of somebody."<ref name=":4">{{Cite web|url=https://www.theguardian.com/technology/2017/dec/04/racist-facial-recognition-white-coders-black-people-police|title=How white engineers built racist code – and why it's dangerous for black people|last=Breland|first=Ali|date=December 4, 2017|website=The Guardian|language=en|access-date=April 10, 2018}}</ref> Given such large margins of error, both legal advocates and facial recognition software companies say that the technology should only supply a portion of a case, not evidence that can lead to the arrest of an individual.<ref name=":4" /> The lack of regulations requiring facial recognition companies to test their systems for racial bias is a significant flaw in the technology's adoption by law enforcement.
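Disparities like those reported above are often expressed as per-group error rates such as the false non-match rate (FNMR): the share of genuine (same-person) comparisons that a system incorrectly rejects. A minimal Python sketch of how such a per-group rate could be computed (all scores, group labels, and the threshold are hypothetical; this is a generic illustration, not the methodology of any study cited here):

<syntaxhighlight lang="python">
def false_non_match_rate(genuine_scores, threshold):
    """FNMR: fraction of genuine (same-person) comparison scores
    that fall below the match threshold and are wrongly rejected."""
    misses = sum(1 for s in genuine_scores if s < threshold)
    return misses / len(genuine_scores)

# Hypothetical similarity scores for genuine pairs, split by demographic group.
scores_by_group = {
    "group_A": [0.91, 0.88, 0.95, 0.79, 0.93],
    "group_B": [0.72, 0.85, 0.64, 0.90, 0.58],
}

THRESHOLD = 0.80  # operating point chosen by the deployer
for group, scores in scores_by_group.items():
    print(group, false_non_match_rate(scores, THRESHOLD))
# group_A -> 0.2 (1 of 5 rejected), group_B -> 0.6 (3 of 5 rejected)
</syntaxhighlight>

Because score distributions and the chosen threshold vary by system and dataset, the same matcher can show different disparities at different operating points.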
CyberExtruder, a company that markets facial recognition software to law enforcement, said that it had not performed testing or research on bias in its software. CyberExtruder did note that some skin colors are more difficult for the software to recognize within the current limitations of the technology. "Just as individuals with very dark skin are hard to identify with high significance via facial recognition, individuals with very pale skin are the same," said Blake Senftner, a senior software engineer at CyberExtruder.<ref name=":4" /> The United States' National Institute of Standards and Technology (NIST) carried out extensive testing of FRT systems' 1:1 verification<ref name="meilee.ngan@nist.gov">{{Cite web|date=December 14, 2016|title=Face Recognition Vendor Test (FRVT) Ongoing|url=https://www.nist.gov/programs-projects/face-recognition-vendor-test-frvt-ongoing|access-date=February 15, 2022|website=NIST|language=en}}</ref> and 1:many identification,<ref name="meilee.ngan@nist.gov"/> and also tested how the accuracy of FRT varies across demographic groups. The independent study concluded that, at present, no FRT system is 100% accurate.<ref>{{Cite web|last1=Grother|first1=Patrick J.|last2=Ngan|first2=Mei L.|last3=Hanaoka|first3=Kayee K.|date=December 19, 2019|title=Face Recognition Vendor Test Part 3: Demographic Effects|url=https://www.nist.gov/publications/face-recognition-vendor-test-part-3-demographic-effects|language=en|website=nist.gov}}</ref>
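The two test modes differ in how a probe image is used: 1:1 verification checks a probe against a single claimed identity, while 1:many identification searches an enrolled gallery for the best match. A minimal sketch of that distinction (hypothetical templates, similarity metric, and threshold; not NIST's evaluation code):

<syntaxhighlight lang="python">
# 1:1 verification vs. 1:many identification, illustrated with toy data.
# similarity() is a stand-in for a real face matcher; all values hypothetical.

GALLERY = {"alice": [0.1, 0.7], "bob": [0.9, 0.2]}  # enrolled templates

def similarity(probe, template):
    # Placeholder metric: negative Euclidean distance (higher is more similar).
    return -sum((p - t) ** 2 for p, t in zip(probe, template)) ** 0.5

def verify(probe, claimed_id, threshold=-0.3):
    """1:1 verification: does the probe match one claimed identity?"""
    return similarity(probe, GALLERY[claimed_id]) >= threshold

def identify(probe, threshold=-0.3):
    """1:many identification: search the whole gallery for the best match."""
    best_id = max(GALLERY, key=lambda i: similarity(probe, GALLERY[i]))
    return best_id if similarity(probe, GALLERY[best_id]) >= threshold else None

probe = [0.12, 0.68]
print(verify(probe, "alice"))  # True: accepted against the claimed identity
print(identify(probe))         # "alice": best gallery match above threshold
</syntaxhighlight>

In the 1:many setting every enrolled identity is a potential match candidate, which is why identification error rates, and any demographic disparities in them, tend to grow with gallery size.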