=== Social ===
{{main|Social perception}}
[[Social perception]] is the part of perception that allows people to understand the individuals and groups of their social world. Thus, it is an element of [[social cognition]].<ref>E. R. Smith, D. M. Mackie (2000). ''Social Psychology''. Psychology Press, 2nd ed., p. 20</ref>

==== Speech ====
{{main|Speech perception}}
[[File:Spectrogram_of_I_owe_you.png|right|thumb|Though the phrase "I owe you" can be heard as three distinct words, a [[spectrogram]] reveals no clear boundaries.]]
''Speech perception'' is the process by which [[spoken language]] is heard, interpreted, and understood. Research in this field seeks to understand how human listeners recognize the sounds of speech (or ''[[phonetics]]'') and use that information to understand spoken language. Listeners manage to perceive words across a wide range of conditions, as the sound of a word can vary widely according to the words that surround it and the [[tempo]] of the speech, as well as the physical characteristics, [[Accent (dialect)|accent]], [[Tone (linguistics)|tone]], and mood of the speaker. [[Reverberation]], the persistence of sound after the sound is produced, can also have a considerable impact on perception. Experiments have shown that people automatically compensate for this effect when hearing speech.<ref name="eop_constancy" /><ref name="Watkins2010">{{cite book|chapter-url=https://books.google.com/books?id=ACkNL-G7gUUC&pg=PA440|title=The Neurophysiological Bases of Auditory Perception|last1=Watkins|first1=Anthony J.|last2=Raimond|first2=Andrew|last3=Makin|first3=Simon J.|date=23 March 2010|publisher=Springer|isbn=978-1-4419-5685-9|editor-last=Lopez-Poveda|editor-first=Enrique A.|page=440|chapter=Room reflection and constancy in speech-like sounds: Within-band effects|access-date=26 March 2011|archive-url=https://web.archive.org/web/20111109163241/http://books.google.com/books?id=ACkNL-G7gUUC&pg=PA440|archive-date=9 November 2011|url-status=live|bibcode=2010nbap.book.....L}}</ref>

The process of perceiving speech begins at the level of the sound within the auditory signal and the process of [[Hearing (sense)|audition]]. The initial auditory signal is compared with visual information—primarily lip movement—to extract acoustic cues and phonetic information. Other sensory modalities may be integrated at this stage as well.<ref>{{cite book|chapter-url=https://books.google.com/books?id=EwY15naRiFgC&q=%22Primacy+of+Multimodal+Speech+Perception%22&pg=PA51|title=The Handbook of Speech Perception|last=Rosenblum|first=Lawrence D.|editor1-last=Pisoni|editor1-first=David|page=51|chapter=Primacy of Multimodal Speech Perception|date=15 April 2008|publisher=John Wiley & Sons |isbn=978-0-470-75677-5|editor2-last=Remez|editor2-first=Robert}}</ref> This speech information can then be used for higher-level language processes, such as [[word recognition]].

Speech perception is not necessarily uni-directional. Higher-level language processes connected with [[Morphology (linguistics)|morphology]], [[syntax]], and [[semantics]] may also interact with basic speech perception processes to aid in the recognition of speech sounds.<ref>{{cite journal |last1=Davis |first1=Matthew H. |last2=Johnsrude |first2=Ingrid S. |title=Hearing speech sounds: Top-down influences on the interface between audition and speech perception |journal=Hearing Research |date=July 2007 |volume=229 |issue=1–2 |pages=132–147 |doi=10.1016/j.heares.2007.01.014|pmid=17317056 |s2cid=12111361 }}</ref> It may not be necessary (or even possible) for a listener to recognize [[phoneme]]s before recognizing higher units, such as words. In one experiment, Richard M. Warren replaced one phoneme of a word with a cough-like sound. His subjects restored the missing speech sound perceptually without any difficulty, and they were unable to identify accurately which phoneme had been disturbed.<ref>{{cite journal|last=Warren|first=R. M.|year=1970|title=Restoration of missing speech sounds|journal=Science|volume=167|issue=3917|pages=392–393|doi=10.1126/science.167.3917.392|pmid=5409744|bibcode=1970Sci...167..392W|s2cid=30356740}}</ref>

==== Faces ====
{{main|Face perception}}
''Facial perception'' refers to cognitive processes specialized in handling [[human faces]] (including perceiving the identity of an individual) and facial expressions (such as emotional cues).{{Reference needed|date=March 2024}}

==== Social touch ====
{{main|Somatosensory system#Neural processing of social touch}}
The ''somatosensory cortex'' is a part of the brain that receives and encodes sensory information from receptors of the entire body.<ref>{{Cite web|url=https://human-memory.net/somatosensory-cortex/|title=Somatosensory Cortex|date=31 October 2019|website=The Human Memory|access-date=8 March 2020}}</ref>

[[Affective|Affective touch]] is a type of sensory information that elicits an emotional reaction and is usually social in nature. Such information is coded differently from other sensory information. Though the intensity of affective touch is still encoded in the primary somatosensory cortex (S1), the feeling of pleasantness associated with affective touch activates the [[anterior cingulate cortex]] more strongly. Increased [[Blood-oxygen-level-dependent imaging|blood oxygen level-dependent]] (BOLD) contrast, measured during [[functional magnetic resonance imaging]] (fMRI), shows that signals in the anterior cingulate cortex, as well as the [[prefrontal cortex]], are highly correlated with pleasantness ratings of affective touch. Inhibitory [[transcranial magnetic stimulation]] (TMS) of S1 reduces the perceived intensity of affective touch, but not its pleasantness. Therefore, S1 is not directly involved in processing the pleasantness of social affective touch, but it still plays a role in discriminating touch location and intensity.<ref>{{cite journal|last1=Case|first1=LK|last2=Laubacher|first2=CM|last3=Olausson|first3=H|last4=Wang|first4=B|last5=Spagnolo|first5=PA|last6=Bushnell|first6=MC|title=Encoding of Touch Intensity But Not Pleasantness in Human Primary Somatosensory Cortex|journal=J Neurosci|volume=36|issue=21|pages=5850–60|doi=10.1523/JNEUROSCI.1130-15.2016|pmc=4879201|pmid=27225773|year=2016}}</ref>