=== Cross-race effect bias ===
Facial recognition systems often demonstrate lower accuracy when identifying individuals with non-Eurocentric facial features. Known as the [[Cross-race effect]], this bias occurs when systems perform better on racial or ethnic groups that are overrepresented in their training data, resulting in reduced accuracy for underrepresented groups.<ref name="CrossRace2023">{{Cite arXiv |eprint=2305.16443 |title=Human-Machine Comparison for Cross-Race Face Verification: Race Bias at the Upper Limits of Performance |last1=Jeckeln |first1=Gabriel |year=2023 |class=cs.CV}}</ref> The overrepresented group is generally the more populous group in the region where the model is developed. For example, models developed in Asia generally perform better on Asian facial features than on Eurocentric ones, because the former are overrepresented in the developers' training datasets; the reverse is observed in models developed in Western countries.<ref>{{Cite journal |last1=Phillips |first1=P. Jonathon |last2=Jiang |first2=Fang |last3=Narvekar |first3=Abhijit |last4=Ayyad |first4=Julianne |last5=O'Toole |first5=Alice J. |date=2011-02-02 |title=An other-race effect for face recognition algorithms |url=https://dl.acm.org/doi/10.1145/1870076.1870082 |journal=ACM Trans. Appl. Percept. |volume=8 |issue=2 |pages=14:1–14:11 |doi=10.1145/1870076.1870082 |issn=1544-3558 |url-access=subscription}}</ref> Such systems often lack the training data needed to reliably recognize non-Eurocentric facial features: when the training sets and databases for these [[Machine learning|machine learning]] (ML) models do not contain diverse representation, the models fail to identify the underrepresented populations, compounding their racial biases.<ref name=":1" /> The cross-race effect is not exclusive to machines; humans also have difficulty recognizing faces from racial or ethnic groups different from their own. This is an example of inherent human biases being perpetuated in training datasets.<ref name="Sangrigoli2004">{{Cite journal |last1=Sangrigoli |first1=Sophie |last2=de Schonen |first2=Scania |title=Recognition of Own-Race and Other-Race Faces by Three-Month-Old Infants |journal=Journal of Child Psychology and Psychiatry |volume=45 |issue=7 |year=2004 |pages=1219–1227 |doi=10.1111/j.1469-7610.2004.00319.x |pmid=15335342}}</ref>
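One common way such disparities are surfaced in audits is to disaggregate a system's verification accuracy by demographic group rather than reporting a single overall figure. The sketch below is purely illustrative: the group labels and the result counts are invented, not drawn from any real system or from the studies cited above.

```python
from collections import defaultdict

def accuracy_by_group(results):
    """Compute overall and per-group accuracy from (group, correct) records.

    `results` is a list of (group_label, correct) pairs, one per
    verification attempt; `correct` is True when the system's decision
    was right.
    """
    totals = defaultdict(int)
    hits = defaultdict(int)
    for group, correct in results:
        totals[group] += 1
        hits[group] += int(correct)
    overall = sum(hits.values()) / sum(totals.values())
    per_group = {g: hits[g] / totals[g] for g in totals}
    return overall, per_group

# Hypothetical audit data: a respectable-looking overall accuracy hides
# a large gap between the overrepresented and underrepresented groups.
results = (
    [("overrepresented", True)] * 95 + [("overrepresented", False)] * 5
    + [("underrepresented", True)] * 70 + [("underrepresented", False)] * 30
)

overall, per_group = accuracy_by_group(results)
print(overall)    # 0.825 overall, masking 0.95 vs 0.70 per group
print(per_group)
```

Reporting only the overall number (0.825 here) would conceal the 25-point accuracy gap between the two groups, which is why disaggregated evaluation is a standard recommendation in bias audits.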