Affective computing
===Physiological monitoring===

A user's affective state can be detected by monitoring and analyzing their physiological signs. These signs range from changes in heart rate and skin conductance to minute contractions of the facial muscles and changes in facial blood flow. This area is gaining momentum, and real products that implement the techniques are now appearing. The four physiological signs most commonly analyzed are [[Pulse|blood volume pulse]], [[Skin conductance|galvanic skin response]], [[facial electromyography]], and facial color patterns.

====Blood volume pulse====

=====Overview=====
A subject's blood volume pulse (BVP) can be measured by a process called [[photoplethysmography]], which produces a graph indicating blood flow through the extremities.<ref name="Picard, Rosalind 1998">Picard, Rosalind (1998). Affective Computing. MIT.</ref> The peaks of the waves indicate a cardiac cycle where the heart has pumped blood to the extremities. If the subject experiences fear or is startled, their heart usually 'jumps' and beats quickly for some time, causing the amplitude of the cardiac cycle to increase. This can clearly be seen on a photoplethysmograph as an increase in the distance between the trough and the peak of the wave. As the subject calms down and peripheral blood vessels dilate, allowing more blood to flow back to the extremities, the cycle returns to normal.

=====Methodology=====
Special sensor hardware shines infra-red light on the skin and measures the amount of light reflected. The amount of reflected and transmitted light correlates with the BVP, as light is absorbed by hemoglobin, which is abundant in the bloodstream.

=====Disadvantages=====
It can be cumbersome to ensure that the sensor shining the infra-red light and monitoring the reflected light always points at the same extremity, especially since subjects often stretch and readjust their position while using a computer.
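The amplitude change described above (peak-to-trough distance growing during a startle response) can be sketched in a few lines. This is a minimal illustration on a synthetic signal, not a real photoplethysmography pipeline; the function name and the simple neighbor-comparison peak detector are assumptions for the example.

```python
import numpy as np

def bvp_amplitude(signal):
    """Estimate the peak-to-trough amplitude of each cardiac cycle in a
    sampled photoplethysmograph (PPG) signal.

    A sample is a peak (trough) if it is larger (smaller) than both of
    its neighbors; each amplitude is a peak minus the preceding trough.
    """
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1]]
    troughs = [i for i in range(1, len(signal) - 1)
               if signal[i - 1] > signal[i] < signal[i + 1]]
    amps = []
    for p in peaks:
        prior = [t for t in troughs if t < p]
        if prior:  # skip a peak with no preceding trough
            amps.append(signal[p] - signal[prior[-1]])
    return amps

# Synthetic PPG: a sine wave whose amplitude doubles halfway through,
# mimicking the startle response described above (~72 beats per minute).
t = np.linspace(0, 10, 1000)
amp = np.where(t < 5, 1.0, 2.0)
ppg = amp * np.sin(2 * np.pi * 1.2 * t)

amplitudes = bvp_amplitude(ppg)
# Cycles after the "startle" are roughly twice as tall as the early ones.
```

On this toy signal the early cycles have a peak-to-trough amplitude near 2 and the later ones near 4; a real system would additionally filter noise and track the sensor's position on the extremity.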
There are other factors that can affect one's blood volume pulse. Because it measures blood flow through the extremities, if the subject feels hot or particularly cold, their body may allow more or less blood to flow to the extremities, regardless of the subject's emotional state.

[[File:Em-face-2.png|thumb|left|The corrugator supercilii muscle and the zygomaticus major muscle are the two main muscles whose electrical activity is measured in facial electromyography.]]

====Facial electromyography====
{{Main|Facial electromyography}}
Facial electromyography is a technique used to measure the electrical activity of the facial muscles by amplifying the tiny electrical impulses that are generated by muscle fibers when they contract.<ref name="Larsen JT 2003">Larsen JT, Norris CJ, Cacioppo JT, "[https://web.archive.org/web/20181030170423/https://pdfs.semanticscholar.org/c3a5/4bfbaaade376aee951fe8578e6436be59861.pdf Effects of positive and negative affect on electromyographic activity over zygomaticus major and corrugator supercilii]", (September 2003)</ref> The face expresses a great deal of emotion; however, two main facial muscle groups are usually studied to detect it:
* The corrugator supercilii muscle, also known as the 'frowning' muscle, draws the brow down into a frown, and is therefore the best test for a negative, unpleasant emotional response.
* The zygomaticus major muscle is responsible for pulling the corners of the mouth back when you smile, and is therefore the muscle used to test for a positive emotional response.

[[File:Gsrplot.svg|500px|thumb|A plot of skin resistance, measured using GSR, against time while the subject played a video game. Several clear peaks are visible in the graph, which suggests that GSR is a good method of differentiating between an aroused and a non-aroused state.
For example, at the start of the game, where there is usually not much exciting gameplay, a high level of resistance is recorded, which suggests a low level of conductivity and therefore less arousal. This is in clear contrast with the sudden trough where the player's character is killed, as the player is usually very stressed and tense at that point in the game.]]

====Galvanic skin response====
{{Main|Galvanic skin response}}
Galvanic skin response (GSR) is an outdated term for a more general phenomenon known as [[electrodermal activity]] or EDA. EDA is the general phenomenon whereby the skin's electrical properties change. The skin is innervated by the [[sympathetic nervous system]], so measuring its resistance or conductance provides a way to quantify small changes in the sympathetic branch of the autonomic nervous system. As the sweat glands are activated, even before the skin feels sweaty, the level of EDA can be captured (usually using conductance) and used to discern small changes in autonomic arousal. The more aroused a subject is, the greater the skin conductance tends to be.<ref name="Picard, Rosalind 1998"/>

Skin conductance is often measured by placing two small [[silver-silver chloride]] electrodes on the skin and applying a small voltage between them. To maximize comfort and reduce irritation, the electrodes can be placed on the wrist, legs, or feet, which leaves the hands fully free for daily activity.

====Facial color====

=====Overview=====
The surface of the human face is innervated by a large network of blood vessels. Blood flow variations in these vessels yield visible color changes on the face. Whether or not facial emotions activate the facial muscles, variations in blood flow, blood pressure, glucose levels, and other changes occur. Also, the facial color signal is independent of that provided by facial muscle movements.<ref name="face">{{cite journal | last1=Benitez-Quiroz | first1=Carlos F.
| last2=Srinivasan | first2=Ramprakash | last3=Martinez | first3=Aleix M. | title=Facial color is an efficient mechanism to visually transmit emotion | journal=Proceedings of the National Academy of Sciences | volume=115 | issue=14 | date=2018-03-19 | doi=10.1073/pnas.1716084115 | pages=3581–3586 | pmid=29555780 | pmc=5889636 | bibcode=2018PNAS..115.3581B | doi-access=free }}</ref>

=====Methodology=====
Approaches are based on facial color changes. [[Delaunay triangulation]] is used to create triangular local areas on the face. Triangles that define the interior of the mouth and the eyes (sclera and iris) are removed, and the pixels of the remaining triangular areas are used to create feature vectors.<ref name="face"/> Converting the pixel colors from the standard RGB color space to a color space such as the oRGB color space<ref name="orgb">{{cite journal | last1=Bratkova | first1=Margarita | last2=Boulos | first2=Solomon | last3=Shirley | first3=Peter | title=oRGB: A Practical Opponent Color Space for Computer Graphics | journal=IEEE Computer Graphics and Applications | volume=29 | issue=1 | year=2009 | doi=10.1109/mcg.2009.13 | pages=42–55 | pmid=19363957 | s2cid=16690341 }}</ref> or LMS channels has been shown to perform better when dealing with faces.<ref name="mec">Hadas Shahar, [[Hagit Hel-Or]], [http://openaccess.thecvf.com/content_ICCVW_2019/papers/CVPM/Shahar_Micro_Expression_Classification_using_Facial_Color_and_Deep_Learning_Methods_ICCVW_2019_paper.pdf Micro Expression Classification using Facial Color and Deep Learning Methods], The IEEE International Conference on Computer Vision (ICCV) Workshops, 2019.</ref> The feature vectors are therefore mapped into the better-performing color space and decomposed into red-green and yellow-blue channels, and deep learning methods are then used to classify the corresponding emotions.
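The per-region opponent-channel features described above can be sketched as follows. This is an illustrative simplification, not the published pipeline: the matrix is the linear stage of the oRGB transform as given by Bratkova et al. (the full oRGB space additionally applies a non-linear rotation of the chromatic plane, omitted here), and the function name and placeholder rectangular masks standing in for Delaunay triangles are assumptions for the example.

```python
import numpy as np

# Linear stage of the oRGB transform: RGB to luma plus two opponent
# channels. The full oRGB space adds a non-linear chromatic rotation,
# omitted here for brevity.
ORGB_MATRIX = np.array([
    [0.2990,  0.5870,  0.1140],   # L   (luma)
    [0.5000,  0.5000, -1.0000],   # C1  (yellow-blue opponent)
    [0.8660, -0.8660,  0.0000],   # C2  (red-green opponent)
])

def face_color_features(image, regions):
    """Build a feature vector of mean opponent-channel color per region.

    image   -- H x W x 3 array of RGB values in [0, 1]
    regions -- list of boolean H x W masks, standing in for the
               Delaunay-triangle areas described above
    """
    opponent = image @ ORGB_MATRIX.T                    # per-pixel L, C1, C2
    feats = [opponent[mask].mean(axis=0) for mask in regions]
    return np.concatenate(feats)

# Toy example: a 2x2 "face" split into two one-row regions.
img = np.array([[[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]],    # pure red row
                [[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]]])   # pure blue row
masks = [np.array([[True, True], [False, False]]),
         np.array([[False, False], [True, True]])]

fv = face_color_features(img, masks)
# The red region yields a strongly positive red-green (C2) component;
# the blue region yields a strongly negative yellow-blue (C1) component.
```

The resulting vector (three channel means per region) is the kind of input that would then be fed to a deep learning classifier to recover the associated emotion.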