==Technologies and techniques==
The most widely used current designs are video-based eye trackers. A camera focuses on one or both eyes and records eye movement as the viewer looks at some kind of stimulus. Most modern eye trackers use the center of the pupil and [[infrared]] / [[near-infrared]] non-collimated light to create [[corneal reflection]]s (CR). The vector between the pupil center and the corneal reflections can be used to compute the point of regard on a surface or the gaze direction. A simple calibration procedure for each individual is usually needed before using the eye tracker.<ref>{{cite journal|last=Witzner Hansen|first=Dan|author2=Qiang Ji|title=In the Eye of the Beholder: A Survey of Models for Eyes and Gaze|journal=IEEE Trans. Pattern Anal. Mach. Intell.|date=March 2010|volume=32|issue=3|pages=478–500|url=http://dl.acm.org/citation.cfm?id=1729561|doi=10.1109/tpami.2009.30|pmid=20075473|s2cid=16489508|url-access=subscription}}</ref>

Two general types of infrared / near-infrared (also known as active-light) eye-tracking techniques are used: bright-pupil and dark-pupil. They differ in the location of the illumination source with respect to the optics. If the illumination is [[coaxial]] with the optical path, the eye acts as a [[retroreflector]]: the light reflects off the [[retina]], creating a bright-pupil effect similar to [[Red-eye effect|red eye]].
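The pupil–corneal-reflection calibration described above can be sketched as a least-squares fit mapping the pupil-CR vector to known on-screen target positions. The following is a minimal illustration with synthetic data; the second-order feature set, the 9-point grid, and the eye-to-screen model are assumptions for demonstration, not any particular tracker's method:

```python
import numpy as np

def features(v):
    """Second-order polynomial features of the pupil-CR vector (vx, vy)."""
    vx, vy = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])

def calibrate(vectors, targets):
    """Least-squares fit from pupil-CR vectors to known screen points.

    vectors: (n, 2) pupil center minus corneal reflection, in image pixels
    targets: (n, 2) calibration target positions, in screen coordinates
    """
    coeffs, *_ = np.linalg.lstsq(features(vectors), targets, rcond=None)
    return coeffs  # shape (6, 2): one coefficient column per screen axis

def gaze(coeffs, vectors):
    """Map new pupil-CR vectors to estimated points of regard."""
    return features(vectors) @ coeffs

# Synthetic 9-point calibration grid (illustrative values only).
rng = np.random.default_rng(0)
screen = np.array([[x, y] for y in (100, 500, 900) for x in (200, 960, 1720)], float)
true = np.array([[0.02, 0.01], [0.0005, 0.018]])  # hypothetical screen-to-vector model
vecs = screen @ true.T + rng.normal(0, 0.05, (9, 2))  # noisy measured vectors

c = calibrate(vecs, screen)
est = gaze(c, vecs)
print(np.abs(est - screen).max())  # residual is small after calibration
```

In practice the calibration targets are shown one at a time, and the fitted mapping is then applied to every subsequent video frame.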
If the illumination source is offset from the optical path, the pupil appears dark because the retroreflection from the retina is directed away from the camera.<ref name=gneo>{{cite journal|last1=Gneo|first1=Massimo|last2=Schmid|first2=Maurizio|last3=Conforto|first3=Silvia|last4=D’Alessio|first4=Tommaso|title=A free geometry model-independent neural eye-gaze tracking system|journal=Journal of NeuroEngineering and Rehabilitation|date=2012|volume=9|issue=1|pages=82|doi=10.1186/1743-0003-9-82|pmid=23158726|pmc=3543256 |doi-access=free }}</ref> Bright-pupil tracking creates greater iris/pupil contrast, allowing more robust eye tracking with any iris pigmentation, and greatly reduces interference from eyelashes and other obscuring features.<ref>The Eye: A Survey of Human Vision; Wikimedia Foundation</ref> It also allows tracking in lighting conditions ranging from total darkness to very bright.

Another, less common, method is known as passive light. It uses visible light for illumination, which may distract users.<ref name=gneo/> A further challenge with this method is that the pupil contrast is lower than in active-light methods, so the center of the [[Iris (anatomy)|iris]] is used to calculate the vector instead.<ref>{{cite journal|last1=Sigut|first1=J|last2=Sidha|first2=SA|title=Iris center corneal reflection method for gaze tracking using visible light.|journal=IEEE Transactions on Bio-Medical Engineering|date=February 2011|volume=58|issue=2|pages=411–9|pmid=20952326|doi=10.1109/tbme.2010.2087330|s2cid=206611506}}</ref> This calculation requires detecting the boundary between the iris and the white [[sclera]] ([[Corneal limbus|limbus]] tracking).
Limbus tracking presents a further challenge for vertical eye movements, because the eyelids obstruct part of the limbus.<ref>{{cite journal|last1=Hua|first1=H|last2=Krishnaswamy|first2=P|last3=Rolland|first3=JP|title=Video-based eyetracking methods and algorithms in head-mounted displays.|journal=Optics Express|date=15 May 2006|volume=14|issue=10|pages=4328–50|pmid=19516585|doi=10.1364/oe.14.004328|bibcode=2006OExpr..14.4328H|url=https://stars.library.ucf.edu/facultybib2000/6233|doi-access=free}}</ref>

<gallery mode="packed" class="center">
File:Bright pupil by infrared or near infrared illumination.jpg|Infrared / near-infrared: bright pupil.
File:Dark pupil by infrared or near infrared illumination.jpg|Infrared / near-infrared: dark pupil and corneal reflection.
File:Visible light eye-tracking algorithm.jpg|Visible light: center of iris (red), corneal reflection (green), and output vector (blue).
</gallery>

Eye-tracking setups vary greatly. Some are head-mounted, some require the head to be stable (for example, with a chin rest), and some function remotely, automatically tracking the head during motion. Most use a sampling rate of at least 30 Hz; 50/60 Hz is most common, but many video-based eye trackers now run at 240, 350, or even 1000/1250 Hz, speeds needed to capture fixational eye movements or to measure saccade dynamics correctly.

Eye movements are typically divided into [[fixation (visual)|fixations]] and saccades – when the eye gaze pauses in a certain position, and when it moves to another position, respectively. The resulting series of fixations and saccades is called a [[scanpath]]. Smooth pursuit describes the eye following a moving object. Fixational eye movements include [[microsaccade]]s: small, involuntary saccades that occur during attempted fixation.
Most information from the eye is made available during a fixation or smooth pursuit, but not during a saccade.<ref>{{cite book |last=Purves |first=D |display-authors=etal |date=2001 |title=Neuroscience |edition=2nd |url=https://www.ncbi.nlm.nih.gov/books/NBK11156/ |location=Sunderland, MA |publisher=Sinauer Assocs |chapter=What Eye Movements Accomplish}}</ref> Scanpaths are useful for analyzing cognitive intent, interest, and salience. Other biological factors (some as simple as gender) may affect the scanpath as well. Eye tracking in [[human–computer interaction]] (HCI) typically investigates the scanpath for usability purposes, or as a method of input in [[gaze-contingency paradigm|gaze-contingent displays]], also known as [[gaze-based interfaces]].<ref>Majaranta, P., Aoki, H., Donegan, M., Hansen, D.W., Hansen, J.P., Hyrskykari, A., Räihä, K.J., ''Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies'', IGI Global, 2011</ref>
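Segmenting gaze samples into fixations and saccades, as described above, is commonly done with a velocity threshold (the I-VT algorithm). The following is a minimal sketch under assumed parameters – a 250 Hz tracker, gaze already expressed in degrees of visual angle, and an illustrative 30°/s threshold – not a reference implementation:

```python
import numpy as np

SAMPLE_RATE_HZ = 250       # assumed tracker sampling rate
VELOCITY_THRESHOLD = 30.0  # deg/s; an illustrative I-VT threshold

def classify(gaze_deg):
    """Velocity-threshold (I-VT) labeling of gaze samples.

    gaze_deg: (n, 2) gaze positions in degrees of visual angle.
    Returns one 'fixation' / 'saccade' label per sample.
    """
    # Sample-to-sample angular velocity in degrees per second.
    velocity = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) * SAMPLE_RATE_HZ
    labels = ["fixation"]  # first sample has no preceding velocity
    labels += ["saccade" if v > VELOCITY_THRESHOLD else "fixation" for v in velocity]
    return labels

def scanpath(gaze_deg, labels):
    """Collapse consecutive fixation samples into fixation centroids."""
    path, run = [], []
    for point, label in zip(gaze_deg, labels):
        if label == "fixation":
            run.append(point)
        elif run:  # a saccade ends the current fixation run
            path.append(np.mean(run, axis=0))
            run = []
    if run:
        path.append(np.mean(run, axis=0))
    return np.array(path)

# Two fixations separated by one fast saccade (synthetic data).
samples = np.array([[0.0, 0.0]] * 5 + [[5.0, 0.0]] + [[10.0, 0.0]] * 5)
labels = classify(samples)
print(scanpath(samples, labels))  # two centroids: (0, 0) and (10, 0)
```

The resulting centroid sequence is a simple scanpath; research-grade classifiers add duration thresholds, noise filtering, and separate handling of smooth pursuit.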