Facial Action Coding System
== Uses ==

=== Baby FACS ===

Baby FACS (Facial Action Coding System for Infants and Young Children)<ref>{{cite book |last1=Oster |first1=Harriet |title=Baby FACS: Facial Action Coding System for Infants and Young Children |date=2006 |publisher=Unpublished monograph and coding manual. New York University. |location=New York}}</ref> is a behavioral coding system that adapts the adult FACS to code facial expressions in infants aged 0–2 years. Its codes correspond to specific underlying facial muscles but are tailored to infant facial anatomy and expression patterns. It was created by Dr. Harriet Oster and colleagues to address the limitations of applying the adult FACS directly to infants, whose facial musculature, proportions, and developmental capabilities differ significantly from those of adults.

=== Use in medicine ===

FACS has been proposed for use in the analysis of [[Clinical depression|depression]],<ref name="pmid18020726">{{cite journal | vauthors = Reed LI, Sayette MA, Cohn JF | title = Impact of depression on response to comedy: a dynamic facial coding analysis | journal = Journal of Abnormal Psychology | volume = 116 | issue = 4 | pages = 804–9 | date = November 2007 | pmid = 18020726 | doi = 10.1037/0021-843X.116.4.804 | citeseerx = 10.1.1.307.6950 }}</ref> and in the measurement of pain in patients unable to express themselves verbally.<ref name="pmid18028046">{{cite journal | vauthors = Lints-Martindale AC, Hadjistavropoulos T, Barber B, Gibson SJ | title = A psychophysical investigation of the facial action coding system as an index of pain variability among older adults with and without Alzheimer's disease | journal = Pain Medicine | volume = 8 | issue = 8 | pages = 678–89 | year = 2007 | pmid = 18028046 | doi = 10.1111/j.1526-4637.2007.00358.x | doi-access = free }}</ref>

=== Cross-species applications ===

The original FACS has been modified to analyze facial movements in several non-human primates, namely [[Common chimpanzee|chimpanzee]]s,<ref name="pmid17352572">{{cite journal | vauthors = Parr LA, Waller BM, Vick SJ, Bard KA | title = Classifying chimpanzee facial expressions using muscle action | journal = Emotion | volume = 7 | issue = 1 | pages = 172–81 | date = February 2007 | pmid = 17352572 | pmc = 2826116 | doi = 10.1037/1528-3542.7.1.172 }}</ref> rhesus macaques,<ref>{{cite journal | vauthors = Parr LA, Waller BM, Burrows AM, Gothard KM, Vick SJ | title = Brief communication: MaqFACS: A muscle-based facial movement coding system for the rhesus macaque | journal = American Journal of Physical Anthropology | volume = 143 | issue = 4 | pages = 625–30 | date = December 2010 | pmid = 20872742 | pmc = 2988871 | doi = 10.1002/ajpa.21401 }}</ref> gibbons and siamangs,<ref>{{Cite journal | vauthors = Waller BM, Lembeck M, Kuchenbuch P, Burrows AM, Liebal K | title = GibbonFACS: A Muscle-Based Facial Movement Coding System for Hylobatids | doi = 10.1007/s10764-012-9611-6 | journal = International Journal of Primatology | volume = 33 | issue = 4 | pages = 809–821 | year = 2012 | s2cid = 18321096 }}</ref> and orangutans.<ref>{{Cite journal | vauthors = Caeiro CC, Waller BM, Zimmermann E, Burrows AM, Davila-Ross M | title = OrangFACS: A Muscle-Based Facial Movement Coding System for Orangutans (''Pongo'' spp.) | doi = 10.1007/s10764-012-9652-x | journal = International Journal of Primatology | volume = 34 | pages = 115–129 | year = 2012 | s2cid = 17612028 | url=http://irep.ntu.ac.uk/id/eprint/41473/1/1383920_Waller.pdf}}</ref> More recently, FACS variants have also been developed for domestic species, including dogs,<ref>{{cite journal | vauthors = Waller BM, Peirce K, Caeiro CC, Scheider L, Burrows AM, McCune S, Kaminski J | title = Paedomorphic facial expressions give dogs a selective advantage | journal = PLOS ONE | volume = 8 | issue = 12 | pages = e82686 | year = 2013 | pmid = 24386109 | pmc = 3873274 | doi = 10.1371/journal.pone.0082686 | bibcode = 2013PLoSO...882686W | doi-access = free }}</ref> horses,<ref>{{cite journal | vauthors = Wathan J, Burrows AM, Waller BM, McComb K | title = EquiFACS: The Equine Facial Action Coding System | journal = PLOS ONE | volume = 10 | issue = 8 | pages = e0131738 | date = 2015-08-05 | pmid = 26244573 | pmc = 4526551 | doi = 10.1371/journal.pone.0131738 | bibcode = 2015PLoSO..1031738W | doi-access = free }}</ref> and cats.<ref>{{Cite journal| vauthors = Caeiro CC, Burrows AM, Waller BM |date=2017-04-01|title=Development and application of CatFACS: Are human cat adopters influenced by cat facial expressions?|journal=Applied Animal Behaviour Science|volume=189|pages=66–78|doi=10.1016/j.applanim.2017.01.005|issn=0168-1591|url=http://eprints.lincoln.ac.uk/25940/1/25940%20Proof_APPLAN_4392.pdf}}</ref> As with the human FACS, manuals and certification tests for each animal FACS are available online.<ref>{{Cite web|url=http://animalfacs.com|title=Home|website=animalfacs.com|access-date=2019-10-23}}</ref> Because of its anatomical basis, FACS can thus be used to compare facial repertoires across species. A study by Vick and others (2006) suggests that FACS can be modified by taking differences in underlying morphology into account. Such considerations enable a comparison of the homologous facial movements present in humans and chimpanzees, showing that the facial expressions of both species result from extremely notable appearance changes. The development of FACS tools for different species allows the objective and anatomical study of facial expressions in communicative and emotional contexts. Furthermore, cross-species analysis of facial expressions can help to answer interesting questions, such as which emotions are uniquely human.<ref>{{cite journal | vauthors = Vick SJ, Waller BM, Parr LA, Smith Pasqualini MC, Bard KA | title = A Cross-species Comparison of Facial Morphology and Movement in Humans and Chimpanzees Using the Facial Action Coding System (FACS) | journal = Journal of Nonverbal Behavior | volume = 31 | issue = 1 | pages = 1–20 | date = March 2007 | pmid = 21188285 | pmc = 3008553 | doi = 10.1007/s10919-006-0017-z }}</ref>

The Emotional Facial Action Coding System (EMFACS)<ref>{{citation | vauthors = Friesen W, Ekman P | title = EMFACS-7: Emotional Facial Action Coding System. Unpublished manuscript | publisher = University of California at San Francisco | date = 1983 | volume = 2 | issue = 36 | pages = 1 }}</ref> and the Facial Action Coding System Affect Interpretation Dictionary (FACSAID)<ref>{{Cite web |url=http://www.face-and-emotion.com/dataface/facsaid/description.jsp |title=Facial Action Coding System Affect Interpretation Dictionary (FACSAID) |access-date=2011-02-23 |archive-url=https://web.archive.org/web/20110520164308/http://face-and-emotion.com/dataface/facsaid/description.jsp |archive-date=2011-05-20 |url-status=dead }}</ref> consider only emotion-related facial actions. Examples of these are:

{| class="wikitable sortable"
|-
! Emotion !! Action units
|-
| Happiness || 6+12
|-
| Sadness || 1+4+15
|-
| Surprise || 1+2+5B+26
|-
| Fear || 1+2+4+5+7+20+26
|-
| Anger || 4+5+7+23
|-
| Disgust || 9+15+17
|-
| Contempt || R12A+R14A
|}

=== Computer-generated imagery ===

FACS coding is also used extensively in [[computer animation]], in particular for [[computer facial animation]], with facial expressions being expressed as [[vector graphics]] of AUs.<ref>{{Cite news |last=Walsh |first=Joseph |date=2016-12-16 |title=Rogue One: the CGI resurrection of Peter Cushing is thrilling – but is it right? |language=en-GB |work=The Guardian |url=https://www.theguardian.com/film/filmblog/2016/dec/16/rogue-one-star-wars-cgi-resurrection-peter-cushing |access-date=2023-10-23 |issn=0261-3077}}</ref> FACS vectors are used as weights for [[Morph target animation|blend shape]]s corresponding to each AU, with the resulting face mesh then being used to render the finished face.<ref>{{Cite journal |last1=Gilbert |first1=Michaël |last2=Demarchi |first2=Samuel |last3=Urdapilleta |first3=Isabel |date=October 2021 |title=FACSHuman, a software program for creating experimental material by modeling 3D facial expressions |url=https://link.springer.com/10.3758/s13428-021-01559-9 |journal=Behavior Research Methods |language=en |volume=53 |issue=5 |pages=2252–2272 |doi=10.3758/s13428-021-01559-9 |pmid=33825127 |issn=1554-3528}}</ref><ref>{{Cite web |title=Discover how to create FACS facial blendshapes in Maya {{!}} CG Channel |url=https://www.cgchannel.com/2021/04/discover-how-to-create-facs-facial-blendshapes-in-maya/ |access-date=2023-10-23 |language=en-US}}</ref> [[Deep learning]] techniques can be used to determine the FACS vectors from face images obtained during [[Motion capture|motion capture acting]], [[facial motion capture]] or other performances.<ref>{{Cite book |url=https://ieeexplore.ieee.org/document/7284873 |access-date=2023-10-23 |date=2015 |doi=10.1109/FG.2015.7284873 |language=en-US |last1=Gudi |first1=Amogh |last2=Tasli |first2=H. Emrah |last3=Den Uyl |first3=Tim M. |last4=Maroulis |first4=Andreas |title=2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG) |chapter=Deep learning based FACS Action Unit occurrence and intensity estimation |pages=1–5 |isbn=978-1-4799-6026-2 |s2cid=6283665 }}</ref>
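The emotion-to-AU mappings in the EMFACS table above translate naturally into a lookup table. The following Python sketch is purely illustrative (it is not part of any FACS tooling); it encodes the table verbatim and retrieves the prototypical AU combination for an emotion label. Letter suffixes denote coded intensity (e.g. 5B), and an R prefix denotes a unilateral, right-sided action.

```python
# Prototypical emotion-to-Action-Unit mappings from EMFACS,
# mirroring the table above.
EMOTION_AUS = {
    "happiness": ["6", "12"],
    "sadness": ["1", "4", "15"],
    "surprise": ["1", "2", "5B", "26"],
    "fear": ["1", "2", "4", "5", "7", "20", "26"],
    "anger": ["4", "5", "7", "23"],
    "disgust": ["9", "15", "17"],
    "contempt": ["R12A", "R14A"],
}

def aus_for_emotion(emotion: str) -> list[str]:
    """Return the prototypical AU combination for an emotion label."""
    return EMOTION_AUS[emotion.lower()]

print(aus_for_emotion("Happiness"))  # ['6', '12']
```

In practice, automated coders emit per-AU intensities rather than labels, so the mapping is typically used in the other direction: matching a detected AU set against these prototypes.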
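The blend-shape weighting described in the computer-generated imagery section can be sketched in a few lines: the deformed mesh is the neutral mesh plus the sum of per-AU displacement deltas scaled by the AU activations. In this illustrative Python/NumPy example the mesh and delta values are made-up placeholders; a production pipeline would use artist-authored shapes with thousands of vertices per AU.

```python
import numpy as np

# Hypothetical 4-vertex neutral face mesh (N vertices x 3 coordinates).
neutral = np.zeros((4, 3))

# Placeholder per-AU displacement deltas relative to the neutral mesh.
blend_shapes = {
    "AU6":  np.array([[0.0, 0.2, 0.0]] * 4),  # cheek raiser
    "AU12": np.array([[0.1, 0.3, 0.0]] * 4),  # lip corner puller
}

def apply_aus(neutral_mesh, shapes, weights):
    """Deform the mesh: neutral + sum of AU deltas scaled by activations in [0, 1]."""
    mesh = neutral_mesh.copy()
    for au, w in weights.items():
        mesh += w * shapes[au]
    return mesh

# A 'happiness' pose (AU 6+12, per the EMFACS table) at half intensity.
smile = apply_aus(neutral, blend_shapes, {"AU6": 0.5, "AU12": 0.5})
```

The linearity of this sum is what makes FACS vectors convenient as animation controls: each AU weight can be keyframed or predicted (e.g. by a deep-learning AU estimator) independently of the others.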