== Applications ==
A wide variety of disciplines use eye-tracking techniques, including [[cognitive science]], [[psychology]] (notably [[psycholinguistics]] and the visual world paradigm), [[human-computer interaction]] (HCI), [[human factors and ergonomics]], [[marketing research]] and medical research (neurological diagnosis).<ref>{{cite journal |last1=Duchowski |first1=A. T. |title=A breadth-first survey of eye-tracking applications |journal=Behavior Research Methods, Instruments, & Computers |date=2002 |volume=34 |issue=4 |pages=455–470 |doi=10.3758/BF03195475 |pmid=12564550 |s2cid=4361938 |doi-access=free}}</ref> Specific applications include tracking eye movement in [[eye movement in language reading|language reading]], [[eye movement in music reading|music reading]], human [[activity recognition]], the perception of advertising and the playing of sports; distraction detection and [[cognitive load]] estimation for drivers and pilots; and enabling people with severe motor impairment to operate computers.<ref name="bop.unibe.ch" /> In the field of virtual reality, eye tracking is used in head-mounted displays for a variety of purposes, including reducing processing load by rendering in full detail only the graphical area within the user's gaze.<ref>{{Cite web |last=Rogers |first=Sol |title=Seven Reasons Why Eye-tracking Will Fundamentally Change VR |url=https://www.forbes.com/sites/solrogers/2019/02/05/seven-reasons-why-eye-tracking-will-fundamentally-change-vr/ |access-date=2021-12-16 |website=Forbes |language=en}}</ref>

=== Commercial applications ===
In recent years, the increased sophistication and accessibility of eye-tracking technologies have generated a great deal of interest in the commercial sector. Applications include [[web usability]], advertising, sponsorship, package design and automotive engineering. In general, commercial eye-tracking studies present a target stimulus to a sample of consumers while an eye tracker records eye activity. Examples of target stimuli include websites, television programs, sporting events, films, commercials, magazines, newspapers, packages, shelf displays, consumer systems (ATMs, checkout systems, kiosks) and software. The resulting data can be statistically analyzed and graphically rendered to provide evidence of specific visual patterns. By examining fixations, [[saccades]], pupil dilation, blinks and a variety of other behaviors, researchers can determine a great deal about the effectiveness of a given medium or product. While some companies conduct this type of research internally, many private companies offer eye-tracking services and analysis.

One field of commercial eye-tracking research is web usability. While traditional usability techniques are often quite powerful in providing information on clicking and scrolling patterns, eye tracking can analyze user interaction between clicks and how much time a user spends between clicks, providing valuable insight into which features are the most eye-catching, which cause confusion and which are ignored altogether. Specifically, eye tracking can be used to assess search efficiency, branding, online advertisements, navigation usability, overall design and many other site components. Analyses may target a prototype or competitor site in addition to the main client site.
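The kind of analysis described above can be illustrated with a minimal sketch: fixations are assigned to rectangular areas of interest (AOIs) on a page, and the total dwell time per AOI indicates which features attract attention and which are ignored. The AOI layout and fixation records below are invented for illustration and are not drawn from any cited study.

<syntaxhighlight lang="python">
# Minimal sketch: dwell time per area of interest (AOI) on a web page.
# AOI rectangles and fixation records below are made-up illustrative data.

# Each AOI: name -> (x_min, y_min, x_max, y_max) in screen pixels.
AOIS = {
    "logo":      (0,   0,   200,  80),
    "nav_bar":   (0,   80,  1280, 140),
    "main_text": (100, 160, 900,  700),
    "banner_ad": (920, 160, 1280, 400),
}

# Each fixation: (x, y, duration in milliseconds).
fixations = [
    (60, 40, 310), (500, 110, 180), (400, 300, 650),
    (450, 420, 540), (1000, 250, 120), (420, 500, 700),
]

def aoi_of(x, y):
    """Return the first AOI containing the point, or None."""
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

dwell = {name: 0 for name in AOIS}
for x, y, dur in fixations:
    name = aoi_of(x, y)
    if name is not None:
        dwell[name] += dur

# Rank AOIs by total dwell time: long dwell suggests an eye-catching
# (or confusing) element; zero dwell suggests an ignored element.
for name, ms in sorted(dwell.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {ms:5d} ms")
</syntaxhighlight>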
Eye tracking is commonly used in a variety of advertising media. Commercials, print ads, online ads and sponsored programs are all conducive to analysis with current eye-tracking technology. One example is the analysis of eye movements over advertisements in the [[Yellow pages|Yellow Pages]]: a study examined which features caused people to notice an ad, whether people viewed ads in a particular order, and how viewing times varied, and found that ad size, graphics, color and copy all influence attention to advertisements. This allows researchers to assess in great detail how often a sample of consumers fixates on a target logo, product or ad, so an advertiser can quantify the success of a given campaign in terms of actual visual attention.<ref>{{cite journal |last1=Lohse |first1=Gerald |last2=Wu |first2=D. J. |s2cid=1064385 |title=Eye Movement Patterns on Chinese Yellow Pages Advertising |journal=Electronic Markets |date=1 February 2001 |volume=11 |issue=2 |pages=87–96 |doi=10.1080/101967801300197007}}</ref> Another example is a study which found that on a [[search engine results page]], authorship snippets received more attention than the paid ads or even the first organic result.<ref>[http://www.searchenginejournal.com/eye-tracking-study-importance-using-google-authorship-search-results/71207/ "Eye Tracking Study: The Importance of Using Google Authorship in Search Results"]</ref> Commercial eye-tracking research has also reached the field of recruitment: one study analyzed how recruiters screen [[LinkedIn]] profiles and presented the results as [[heat map]]s.<ref>{{Cite web |date=2019-02-21 |title=3 seconds is enough to screen candidate's profile. Eye tracking research results. |url=https://elementapp.ai/blog/3-seconds-to-screen-candidate-profile-biometric-research-results/ |access-date=2021-04-03 |website=Element's Blog - nowości ze świata rekrutacji, HR Tech i Element |language=pl-PL}}</ref>

=== Safety applications ===
In 2017, scientists constructed a Deep Integrated Neural Network (DINN) out of a deep neural network and a convolutional neural network.<ref name=":0">{{Cite journal |last1=Zhao |first1=Lei |last2=Wang |first2=Zengcai |last3=Zhang |first3=Guoxin |last4=Qi |first4=Yazhou |last5=Wang |first5=Xiaojin |date=15 November 2017 |title=Eye state recognition based on deep integrated neural network and transfer learning |journal=Multimedia Tools and Applications |volume=77 |issue=15 |pages=19415–19438 |doi=10.1007/s11042-017-5380-8 |s2cid=20691291 |issn=1380-7501}}</ref> The goal was to use [[deep learning]] to examine images of drivers and determine their level of drowsiness by "classify[ing] eye states". With enough images, the proposed DINN could ideally determine when drivers blink, how often they blink, and for how long, and from there judge how tired a given driver appears to be, effectively conducting an eye-tracking exercise. The DINN was trained on data from over 2,400 subjects and correctly diagnosed their states 96–99.5% of the time; most other artificial intelligence models performed at rates above 90%.<ref name=":0" /> This technology could provide another avenue for [[driver drowsiness detection]].
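Although the cited DINN works directly on driver images, the step from per-frame eye states to a drowsiness estimate can be illustrated with a simple sketch: given a sequence of open/closed labels such as an eye-state classifier might output, one can compute the blink count, the mean blink duration and the fraction of time the eyes are closed (a PERCLOS-style measure). The thresholds and data below are illustrative and are not taken from the study.

<syntaxhighlight lang="python">
# Illustrative sketch (not the cited DINN): derive drowsiness indicators
# from a per-frame eye-state sequence produced by some upstream classifier.

FPS = 30  # assumed camera frame rate

# 1 = eyes closed, 0 = eyes open, one label per video frame (toy data).
eye_states = [0]*40 + [1]*4 + [0]*50 + [1]*25 + [0]*30 + [1]*6 + [0]*20

def blink_stats(states, fps):
    """Return (blink_count, mean_blink_duration_s, fraction_closed)."""
    blinks = []  # durations of closed-eye runs, in frames
    run = 0
    for s in states:
        if s == 1:
            run += 1
        elif run:
            blinks.append(run)
            run = 0
    if run:
        blinks.append(run)
    closed = sum(states)
    mean_dur = (sum(blinks) / len(blinks) / fps) if blinks else 0.0
    return len(blinks), mean_dur, closed / len(states)

count, mean_dur, perclos = blink_stats(eye_states, FPS)
print(f"blinks={count}, mean duration={mean_dur:.2f}s, closed fraction={perclos:.2f}")

# Illustrative rule: long eye closures or a high closed-eye fraction
# are commonly treated as warning signs of drowsiness.
if perclos > 0.15 or mean_dur > 0.5:
    print("driver may be drowsy")
</syntaxhighlight>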
=== Game theory applications ===
In a 2019 study, a convolutional neural network (CNN) was constructed with the ability to identify individual chess pieces the same way other CNNs can identify facial features.<ref name=":1">{{Cite book |last1=Louedec |first1=Justin Le |last2=Guntz |first2=Thomas |last3=Crowley |first3=James L. |last4=Vaufreydaz |first4=Dominique |title=Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications |chapter=Deep learning investigation for chess player attention prediction using eye-tracking and game data |date=2019 |pages=1–9 |location=New York, New York, USA |publisher=ACM Press |doi=10.1145/3314111.3319827 |isbn=978-1-4503-6709-7 |arxiv=1904.08155 |bibcode=2019arXiv190408155L |s2cid=118688325}}</ref> It was then fed eye-tracking data from 30 chess players of various skill levels. With this data, the CNN used gaze estimation to determine the parts of the chess board to which a player was paying close attention, and generated a saliency map to illustrate those parts of the board. Ultimately, the CNN combined its knowledge of the board and pieces with its saliency map to predict the players' next move. Regardless of the [[Training data set|training dataset]] the neural network system was trained on, it predicted the next move more accurately than if it had selected any possible move at random, and the saliency maps drawn for any given player and situation were more than 54% similar.<ref name=":1" />
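The saliency maps in the cited study are produced by the trained network itself, but the underlying idea of turning gaze points into a per-square attention map can be sketched directly: each fixation spreads a Gaussian-weighted contribution over nearby board squares. The fixation coordinates and kernel width below are illustrative assumptions.

<syntaxhighlight lang="python">
# Simplified illustration: build an 8x8 "saliency" map from gaze points
# by spreading each fixation over nearby squares with a Gaussian kernel.
# This stands in for the CNN-produced maps described in the cited study.
import math

BOARD = 8
SIGMA = 0.8  # spread of attention around a fixation, in squares (assumed)

# Toy fixations as fractional board coordinates (file, rank).
fixations = [(4.2, 3.9), (4.0, 4.1), (6.5, 1.2), (4.1, 3.8)]

saliency = [[0.0] * BOARD for _ in range(BOARD)]
for fx, fy in fixations:
    for r in range(BOARD):
        for c in range(BOARD):
            d2 = (c + 0.5 - fx) ** 2 + (r + 0.5 - fy) ** 2
            saliency[r][c] += math.exp(-d2 / (2 * SIGMA ** 2))

# Normalize to [0, 1] and print; the hot spot marks the region the
# player attended to most (here, the central squares).
peak = max(max(row) for row in saliency)
for row in reversed(saliency):  # print rank 8 at the top
    print(" ".join(f"{v / peak:.2f}" for v in row))
</syntaxhighlight>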
=== Assistive technology ===
People with severe motor impairment can use eye tracking for interacting with computers,<ref>{{cite book |last1=Corno |first1=F. |last2=Farinetti |first2=L. |last3=Signorile |first3=I. |title=Proceedings. IEEE International Conference on Multimedia and Expo |chapter=A cost-effective solution for eye-gaze assistive technology |date=August 2002 |chapter-url=https://ieeexplore.ieee.org/document/1035632 |volume=2 |pages=433–436 |doi=10.1109/ICME.2002.1035632 |isbn=0-7803-7304-9 |s2cid=42361339 |access-date=5 August 2020}}</ref> as it is faster than single-switch scanning techniques and intuitive to operate.<ref>{{cite journal |last1=Pinheiro |first1=C. |last2=Naves |first2=E. L. |last3=Pino |first3=P. |last4=Lesson |first4=E. |last5=Andrade |first5=A.O. |last6=Bourhis |first6=G. |date=July 2011 |title=Alternative communication systems for people with severe motor disabilities: a survey |journal=BioMedical Engineering OnLine |volume=10 |issue=1 |page=31 |doi=10.1186/1475-925X-10-31 |pmid=21507236 |pmc=3103465 |doi-access=free}}</ref><ref>{{cite journal |last1=Saunders |first1=M.D. |last2=Smagner |first2=J.P. |last3=Saunders |first3=R.R. |date=August 2003 |title=Improving methodological and technological analyses of adaptive switch use of individuals with profound multiple impairments |journal=Behavioral Interventions |volume=18 |issue=4 |pages=227–243 |doi=10.1002/bin.141}}</ref> Motor impairment caused by [[cerebral palsy]]<ref>{{cite web |title=Cerebral Palsy (CP) |url=https://www.cdc.gov/ncbddd/cp/facts.html |access-date=4 August 2020}}</ref> or [[amyotrophic lateral sclerosis]] often affects speech, and users with severe speech and motor impairment (SSMI) use a type of software known as an augmentative and alternative communication (AAC) aid,<ref>{{cite journal |last1=Wilkinson |first1=K.M. |last2=Mitchell |first2=T. |date=March 2014 |title=Eye tracking research to answer questions about augmentative and alternative communication assessment and intervention |journal=Augmentative and Alternative Communication |volume=30 |issue=2 |pages=106–119 |doi=10.3109/07434618.2014.904435 |pmid=24758526 |pmc=4327869}}</ref> which displays icons, words and letters on screen<ref>{{cite journal |last1=Galante |first1=A. |last2=Menezes |first2=P. |date=June 2012 |title=A gaze-based interaction system for people with cerebral palsy |journal=Procedia Technology |volume=5 |pages=895–902 |doi=10.1016/j.protcy.2012.09.099 |doi-access=free}}</ref> and uses text-to-speech software to generate spoken output.<ref>{{cite journal |last1=Blischak |first1=D. |last2=Lombardino |first2=L. |last3=Dyson |first3=A. |date=June 2003 |title=Use of speech-generating devices: In support of natural speech |journal=Augmentative and Alternative Communication |volume=19 |issue=1 |pages=29–35 |doi=10.1080/0743461032000056478 |pmid=28443791 |s2cid=205581902}}</ref> More recently, researchers have also explored eye tracking to control robotic arms<ref>{{cite journal |last1=Sharma |first1=V.K. |last2=Murthy |first2=L. R. D. |last3=Singh Saluja |first3=K. |last4=Mollyn |first4=V. |last5=Sharma |first5=G. |last6=Biswas |first6=Pradipta |date=August 2020 |title=Webcam controlled robotic arm for persons with SSMI |url=https://content.iospress.com/articles/technology-and-disability/tad200264 |journal=Technology and Disability |volume=32 |issue=3 |pages=179–197 |doi=10.3233/TAD-200264 |arxiv=2005.11994 |s2cid=218870304 |access-date=5 August 2020}}</ref> and powered wheelchairs.<ref>{{cite journal |last1=Eid |first1=M.A. |last2=Giakoumidis |first2=N. |last3=El Saddik |first3=A. |date=July 2016 |title=A novel eye-gaze-controlled wheelchair system for navigating unknown environments: case study with a person with ALS |journal=IEEE Access |volume=4 |pages=558–573 |doi=10.1109/ACCESS.2016.2520093 |bibcode=2016IEEEA...4..558E |s2cid=28210837 |doi-access=free}}</ref> Eye tracking is also helpful in analysing visual search patterns,<ref>{{cite journal |last1=Jeevithashree |first1=D. V. |last2=Saluja |first2=K.S. |last3=Biswas |first3=Pradipta |date=December 2019 |title=A case study of developing gaze-controlled interface for users with severe speech and motor impairment |url=https://content.iospress.com/articles/technology-and-disability/tad180206 |journal=Technology and Disability |volume=31 |issue=1–2 |pages=63–76 |doi=10.3233/TAD-180206 |s2cid=199083245 |access-date=5 August 2020 |url-access=subscription}}</ref> detecting the presence of [[nystagmus]], and detecting early signs of learning disability by analysing eye gaze movement during reading.<ref>{{cite journal |last1=Jones |first1=M.W. |last2=Obregón |first2=M. |last3=Kelly |first3=M.L. |last4=Branigan |first4=H.P. |date=May 2008 |title=Elucidating the component processes involved in dyslexic and non-dyslexic reading fluency: An eye-tracking study |url=https://www.sciencedirect.com/science/article/abs/pii/S0010027708002230 |journal=Cognition |volume=109 |issue=3 |pages=389–407 |doi=10.1016/j.cognition.2008.10.005 |pmid=19019349 |s2cid=29389144 |access-date=5 August 2020 |url-access=subscription}}</ref>
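A common selection mechanism in gaze-controlled interfaces of this kind is the dwell click: an on-screen key is triggered once the gaze has rested on it for a set time. The sketch below illustrates that idea with assumed key positions, sampling rate and dwell threshold; it is not taken from any cited system.

<syntaxhighlight lang="python">
# Illustrative dwell-click logic for a gaze-controlled on-screen keyboard.
# Key layout, dwell threshold and the gaze samples are assumed toy values.

DWELL_MS = 800   # gaze must rest this long on a key to select it
SAMPLE_MS = 50   # interval between gaze samples (20 Hz tracker assumed)

# name -> (x_min, y_min, x_max, y_max) of on-screen keys.
KEYS = {"YES": (0, 0, 200, 120), "NO": (220, 0, 420, 120)}

def key_at(x, y):
    for name, (x0, y0, x1, y1) in KEYS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def dwell_select(gaze_samples):
    """Yield a key name each time gaze dwells on it long enough."""
    current, held_ms = None, 0
    for x, y in gaze_samples:
        key = key_at(x, y)
        if key == current and key is not None:
            held_ms += SAMPLE_MS
            if held_ms >= DWELL_MS:
                yield key    # selection event; would trigger speech output
                held_ms = 0  # reset so the key is not re-fired immediately
        else:
            current, held_ms = key, 0

# 20 samples (1000 ms) on YES, then a glance at NO that is too short.
samples = [(100, 60)] * 20 + [(300, 60)] * 5
print(list(dwell_select(samples)))  # -> ['YES']
</syntaxhighlight>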
=== Aviation applications ===
Eye tracking has been studied for flight safety by comparing scan paths and fixation duration to evaluate the progress of pilot trainees,<ref>{{cite journal |last1=Calhoun |first1=G.L. |last2=Janson |date=1991 |title=Eye line-of-sight control compared to manual selection of discrete switches |journal=Armstrong Laboratory Report AL-TR-1991-0015}}</ref> for estimating pilots' skills<ref>{{cite journal |last1=Fitts |first1=P.M. |last2=Jones |first2=R.E. |last3=Milton |first3=J.L |date=1950 |title=Eye movements of aircraft pilots during instrument-landing approaches |journal=Aeronaut. Eng. Rev. |access-date=20 July 2020 |url=https://psycnet.apa.org/record/1950-05519-001}}</ref> and for analyzing a crew's joint attention and shared situational awareness.<ref>{{cite journal |last1=Peysakhovich |first1=V. |last2=Lefrançois |first2=O. |last3=Dehais |first3=F. |last4=Causse |first4=M. |title=The neuroergonomics of aircraft cockpits: the four stages of eye-tracking integration to enhance flight safety. |journal=Safety |date=2018 |volume=4 |issue=1 |page=8 |doi=10.3390/safety4010008 |doi-access=free}}</ref> Eye-tracking technology has also been explored for interacting with helmet-mounted display systems<ref name="deReus2012"/> and multi-functional displays<ref>{{cite journal |last1=DV |first1=JeevithaShree |last2=Murthy |first2=L R.D. |last3=Saluja |first3=K. S. |last4=Biswas |first4=P. |date=2018 |title=Operating different displays in military fast jets using eye gaze tracker |journal=Journal of Aviation Technology and Engineering |volume=8 |issue=4 |access-date=24 July 2020 |url=https://docs.lib.purdue.edu/jate/vol8/iss1/4/}}</ref> in military aircraft. Studies have investigated the utility of eye trackers for head-up target locking and head-up target acquisition in helmet-mounted display systems (HMDS).<ref name="deReus2012">{{cite journal |last1=de Reus |first1=A.J.C. |last2=Zon |first2=R. |last3=Ouwerkerk |first3=R. |title=Exploring the use of an eye tracker in a helmet mounted display |journal=National Aerospace Laboratory Technical Report NLR-TP-2012-001 |date=November 2012}}</ref> Pilots' feedback suggested that even though the technology is promising, its hardware and software components have yet to mature.<ref name="deReus2012" /> Research on interacting with multi-functional displays in a simulator environment showed that eye tracking can significantly improve response times and reduce perceived cognitive load compared with existing systems. Researchers have also investigated using measurements of fixation and pupillary response to estimate a pilot's cognitive load; such estimates can inform the design of next-generation adaptive cockpits with improved flight safety.<ref>{{cite journal |last1=Babu |first1=M. |last2=D V |first2=JeevithaShree |last3=Prabhakar |first3=G. |last4=Saluja |first4=K.P. |last5=Pashilkar |first5=A. |last6=Biswas |first6=P. |date=2019 |title=Estimating pilots' cognitive load from ocular parameters through simulation and in-flight studies |journal=Journal of Eye Movement Research |volume=12 |issue=3 |doi=10.16910/jemr.12.3.3 |pmid=33828735 |pmc=7880144 |access-date=3 August 2020 |url=https://bop.unibe.ch/JEMR/article/view/JEMR.12.3.3}}</ref> Eye tracking is also useful for detecting pilot fatigue.<ref>{{cite journal |last1=Peißl |first1=S. |last2=Wickens |first2=C. D. |last3=Baruah |first3=R. |title=Eye-tracking measures in aviation: A selective literature review |journal=The International Journal of Aerospace Psychology |date=2018 |volume=28 |issue=3–4 |pages=98–112 |doi=10.1080/24721840.2018.1514978 |s2cid=70016458 |doi-access=free}}</ref><ref name="bop.unibe.ch"/>
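Scan-path comparison of the kind used to evaluate trainees is often performed by coding fixations as a sequence of areas of interest and measuring the edit distance between sequences: a trainee whose instrument scan closely matches an expert's yields a small distance. The sketch below illustrates this general approach with hypothetical instrument codes, not the cited studies' exact methods.

<syntaxhighlight lang="python">
# Sketch: compare pilot scan paths as AOI strings via Levenshtein distance.
# Instrument names and the two sequences are hypothetical examples.

def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two sequences."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# Fixation sequences coded by instrument: A=attitude, S=airspeed,
# L=altimeter, H=heading (one letter per fixation).
expert = "ASALAHAL"
trainee = "AASSLLHH"

d = levenshtein(expert, trainee)
similarity = 1 - d / max(len(expert), len(trainee))
print(f"edit distance={d}, similarity={similarity:.2f}")
</syntaxhighlight>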
=== Automotive applications ===
Eye-tracking technology has recently been investigated in the automotive domain in both passive and active ways. The [[National Highway Traffic Safety Administration]] measured glance duration for undertaking secondary tasks while driving and used it to promote safety by discouraging the introduction of excessively distracting devices in vehicles.<ref>{{cite web |title=Visual-Manual NHTSA Driver Distraction Guidelines for In-Vehicle Electronic Devices |date=26 April 2013 |url=https://www.federalregister.gov/documents/2013/04/26/2013-09883/visual-manual-nhtsa-driver-distraction-guidelines-for-in-vehicle-electronic-devices}}</ref> Eye tracking is also being explored as a method to control in-vehicle infotainment systems (IVIS), the multimedia and navigation systems frequently present in contemporary cars.<ref>{{cite patent |country=US |number=8928585B2 |status=patent |title=Eye tracking control of vehicle entertainment systems |gdate=2015-01-06 |fdate=2012-09-06 |pridate=2011-09-09 |invent1=Mondragon, Christopher K. |invent2=Bleacher, Brett |assign1=Thales Avionics Inc |url=https://patents.google.com/patent/US8928585B2/en}}</ref> Though initial research<ref>{{cite book |last1=Poitschke |first1=T. |last2=Laquai |first2=F. |last3=Stamboliev |first3=S. |last4=Rigoll |first4=G. |title=2011 IEEE International Conference on Systems, Man, and Cybernetics |chapter=Gaze-based interaction on multiple displays in an automotive environment |date=2011 |pages=543–548 |doi=10.1109/ICSMC.2011.6083740 |isbn=978-1-4577-0653-0 |s2cid=9362329 |issn=1062-922X |chapter-url=http://mediatum.ub.tum.de/doc/1107278/document.pdf}}</ref> investigated the efficacy of eye-tracking systems for interaction with head-down displays (HDDs), these still required drivers to take their eyes off the road while performing a secondary task. Recent studies have investigated eye-gaze-controlled interaction with head-up displays (HUDs), which eliminates this eyes-off-road distraction.<ref>{{cite journal |last1=Prabhakar |first1=G. |last2=Ramakrishnan |first2=A. |last3=Murthy |first3=L. |last4=Sharma |first4=V.K. |last5=Madan |first5=M. |last6=Deshmukh |first6=S. |last7=Biswas |first7=P. |title=Interactive Gaze & Finger controlled HUD for Cars |journal=Journal of Multimodal User Interface |year=2020 |volume=14 |pages=101–121 |doi=10.1007/s12193-019-00316-9 |s2cid=208261516}}</ref> Eye tracking is also used to monitor the cognitive load of drivers to detect potential distraction. Though researchers<ref>{{cite book |last1=Marshall |first1=S. |title=Proceedings of the IEEE 7th Conference on Human Factors and Power Plants |chapter=The Index of Cognitive Activity: Measuring cognitive workload |date=2002 |pages=7-5-7-9 |doi=10.1109/HFPP.2002.1042860 |isbn=0-7803-7450-9 |s2cid=44561112}}</ref> have explored different methods of estimating drivers' [[cognitive load]] from various physiological parameters, the use of ocular parameters offers a way to monitor drivers' cognitive load with existing eye trackers, in addition to interaction with IVIS.<ref>{{cite book |last1=Duchowski |first1=A. T. |last2=Biele |first2=C. |last3=Niedzielska |first3=A. |last4=Krejtz |first4=K. |last5=Krejtz |first5=I. |last6=Kiefer |first6=P. |last7=Raubal |first7=M. |last8=Giannopoulos |first8=I. |chapter=The Index of Pupillary Activity: Measuring Cognitive Load ''vis-à-vis'' Task Difficulty with Pupil Oscillation |date=2018 |title=Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems |series=Chi '18 |pages=1–13 |doi=10.1145/3173574.3173856 |s2cid=5064488 |doi-access=free |isbn=978-1-4503-5620-6}}</ref><ref>{{cite journal |last1=Prabhakar |first1=G. |last2=Mukhopadhyay |first2=A. |last3=Murthy |first3=L. |last4=Modiksha |first4=M. A. D. A. N. |last5=Biswas |first5=P. |date=2020 |title=Cognitive load estimation using Ocular Parameters in Automotive |journal=Transportation Engineering |volume=2 |page=100008 |doi=10.1016/j.treng.2020.100008 |doi-access=free}}</ref>
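The cited Index of Pupillary Activity estimates load from wavelet analysis of pupil oscillation; the deliberately simplified stand-in below instead scores load from the task-evoked change and variability of pupil diameter relative to a rest baseline. All values are illustrative.

<syntaxhighlight lang="python">
# Deliberately simplified stand-in for pupil-based load estimation
# (the cited Index of Pupillary Activity uses wavelet analysis instead).
# Pupil diameters are in millimetres; all values are toy data.
from statistics import mean, stdev

rest = [3.1, 3.0, 3.2, 3.1, 3.0, 3.1]        # baseline, low-load period
task = [3.4, 3.6, 3.3, 3.7, 3.8, 3.5, 3.9]   # secondary-task period

baseline = mean(rest)
dilation = mean(task) - baseline  # task-evoked pupillary response (mm)
variability = stdev(task)         # short-term fluctuation during the task

print(f"baseline={baseline:.2f} mm, dilation={dilation:+.2f} mm, "
      f"variability={variability:.2f} mm")

# Illustrative reading: a clearly positive task-evoked dilation is a
# classic marker of increased cognitive load; sustained high values
# could prompt an IVIS to defer notifications, for example.
if dilation > 0.2:
    print("elevated cognitive load suspected")
</syntaxhighlight>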
=== Entertainment applications ===
The 2021 video game ''[[Before Your Eyes]]'' registers and reads the player's blinking and uses it as the main way of interacting with the game.<ref>{{cite web |last=McGuire |first=Keegan |website=[[looper.com]] |date=2021-04-08 |title=What The Critics Are Saying About Before Your Eyes |url=https://www.looper.com/377254/what-the-critics-are-saying-about-before-your-eyes/ |archive-url=https://web.archive.org/web/20210423200906/https://www.looper.com/377254/what-the-critics-are-saying-about-before-your-eyes/ |archive-date=2021-04-23 |url-status=live}}</ref><ref>{{cite web |last=von Au |first=Caspar |date=2021-04-24 |work=[[Bayerischer Rundfunk]] |language=de |title=Computerspiel "Before Your Eyes" wird mit den Augen gesteuert |trans-title=Video game "Before Your Eyes" is controlled with your eyes |url=https://www.br.de/nachrichten/kultur/computerspiel-before-your-eyes-wird-mit-den-augen-gesteuert,SVPcdxN |archive-url=https://web.archive.org/web/20210426135949/https://www.br.de/nachrichten/kultur/computerspiel-before-your-eyes-wird-mit-den-augen-gesteuert,SVPcdxN |archive-date=2021-04-26 |url-status=live}}</ref>

=== Engineering applications ===
The widespread availability of eye-tracking technology has drawn attention to its use in empirical software engineering in recent years. Researchers use eye-tracking technology and data-analysis techniques to investigate the understandability of software engineering concepts.
These concepts include the understandability of business process models<ref>{{Cite journal |last1=Petrusel |first1=Razvan |last2=Mendling |first2=Jan |last3=Reijers |first3=Hajo A. |date=2017 |title=How visual cognition influences process model comprehension |url=https://www.infona.pl//resource/bwmeta1.element.elsevier-54b21976-68ee-30d8-ab5f-4de5109c8a26 |journal=Decision Support Systems |language=English |volume=C |issue=96 |pages=1–16 |doi=10.1016/j.dss.2017.01.005 |issn=0167-9236 |url-access=subscription}}</ref> and of diagrams used in software engineering, such as [[Activity diagram|UML activity diagrams]] and [[Entity–relationship model|EER diagrams]].<ref>{{cite journal |last1=Sözen |first1=Nergiz |last2=Say |first2=Bilge |last3=Kılıç |first3=Özkan |title=An Experimental Study Towards Investigating the Effect of Working Memory Capacity on Complex Diagram Understandability |journal=TEM Journal |publisher=Association for Information Communication Technology Education and Science |date=27 November 2020 |issn=2217-8333 |doi=10.18421/tem94-09 |s2cid=229386117 |pages=1384–1395 |doi-access=free}}</ref> Eye-tracking metrics such as fixation, scan-path, scan-path precision, scan-path recall and fixations on an area of interest or relevant region are computed, analyzed and interpreted in terms of model and diagram understandability. The findings are used to improve the understandability of diagrams and models through model-related solutions and by improving person-related factors such as the working-memory capacity, [[Cognitive load|cognitive load]], [[Learning styles|learning style]] and strategy of software engineers and modelers.
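Exact definitions of these metrics vary between studies; under one common formulation (assumed here), scan-path precision is the share of a participant's fixations that land on task-relevant regions of the diagram, and scan-path recall is the share of relevant regions that receive at least one fixation, as in the minimal sketch below.

<syntaxhighlight lang="python">
# Minimal sketch of AOI-based scan-path precision and recall.
# Fixations are already mapped to named diagram regions (toy data);
# the definitions follow one common formulation and vary across studies.

fixated_regions = ["taskA", "gateway1", "label3", "taskA", "taskB",
                   "whitespace", "gateway1"]
relevant_regions = {"taskA", "taskB", "gateway1", "gateway2"}

on_relevant = [r for r in fixated_regions if r in relevant_regions]
precision = len(on_relevant) / len(fixated_regions)
recall = len(set(on_relevant)) / len(relevant_regions)

print(f"precision={precision:.2f}, recall={recall:.2f}")
# Here 5 of 7 fixations hit relevant regions (precision ~0.71) and
# 3 of 4 relevant regions were visited (recall 0.75).
</syntaxhighlight>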
=== Cartographic applications ===
[[Cartography|Cartographic]] research has widely adopted eye-tracking techniques. Researchers have used them to study how individuals perceive and interpret [[map]]s.<ref>{{Cite journal |last1=Krassanakis |first1=Vassilios |last2=Cybulski |first2=Paweł |date=2021-06-14 |title=Eye Tracking Research in Cartography: Looking into the Future |journal=ISPRS International Journal of Geo-Information |language=en |volume=10 |issue=6 |pages=411 |doi=10.3390/ijgi10060411 |bibcode=2021IJGI...10..411K |issn=2220-9964 |doi-access=free}}</ref> For example, eye tracking has been used to study differences in the perception of 2D and 3D visualization,<ref>{{Cite journal |last1=Popelka |first1=Stanislav |last2=Brychtova |first2=Alzbeta |date=2013 |title=Eye-tracking Study on Different Perception of 2D and 3D Terrain Visualisation |url=http://www.tandfonline.com/doi/full/10.1179/1743277413Y.0000000058 |journal=[[The Cartographic Journal]] |language=en |volume=50 |issue=3 |pages=240–246 |doi=10.1179/1743277413Y.0000000058 |bibcode=2013CartJ..50..240P |s2cid=128975149 |issn=0008-7041 |url-access=subscription}}</ref><ref>{{Cite journal |last1=Herman |first1=Lukas |last2=Popelka |first2=Stanislav |last3=Hejlova |first3=Vendula |date=2017-05-31 |title=Eye-tracking Analysis of Interactive 3D Geovisualization |url=https://bop.unibe.ch/JEMR/article/view/3533 |journal=Journal of Eye Movement Research |volume=10 |issue=3 |doi=10.16910/jemr.10.3.2 |issn=1995-8692 |pmc=7141050 |pmid=33828655}}</ref> to compare map-reading strategies between novices and experts<ref>{{Cite journal |last1=Ooms |first1=K. |last2=De Maeyer |first2=P. |last3=Fack |first3=V. |date=2013-11-22 |title=Study of the attentive behavior of novice and expert map users using eye tracking |url=http://dx.doi.org/10.1080/15230406.2013.860255 |journal=Cartography and Geographic Information Science |volume=41 |issue=1 |pages=37–54 |doi=10.1080/15230406.2013.860255 |hdl=1854/LU-4252541 |s2cid=11087520 |issn=1523-0406 |hdl-access=free}}</ref> or between students and their geography teachers,<ref>{{Cite journal |last1=Beitlova |first1=Marketa |last2=Popelka |first2=Stanislav |last3=Vozenilek |first3=Vit |date=2020-08-19 |title=Differences in Thematic Map Reading by Students and Their Geography Teacher |journal=ISPRS International Journal of Geo-Information |volume=9 |issue=9 |pages=492 |doi=10.3390/ijgi9090492 |bibcode=2020IJGI....9..492B |issn=2220-9964 |doi-access=free}}</ref> and to evaluate the cartographic quality of maps.<ref>{{Cite journal |last1=Burian |first1=Jaroslav |last2=Popelka |first2=Stanislav |last3=Beitlova |first3=Marketa |date=2018-05-17 |title=Evaluation of the Cartographical Quality of Urban Plans by Eye-Tracking |journal=ISPRS International Journal of Geo-Information |volume=7 |issue=5 |pages=192 |doi=10.3390/ijgi7050192 |bibcode=2018IJGI....7..192B |issn=2220-9964 |doi-access=free}}</ref> Cartographers have also employed eye tracking to investigate various factors affecting map reading, including attributes such as color or symbol density.<ref>{{Cite journal |last1=Brychtova |first1=Alzbeta |last2=Coltekin |first2=Arzu |date=2016-06-30 |title=An Empirical User Study for Measuring the Influence of Colour Distance and Font Size in Map Reading Using Eye Tracking |url=http://dx.doi.org/10.1179/1743277414y.0000000103 |journal=The Cartographic Journal |volume=53 |issue=3 |pages=202–212 |doi=10.1179/1743277414y.0000000103 |bibcode=2016CartJ..53..202B |s2cid=18911777 |issn=0008-7041 |url-access=subscription}}</ref><ref>{{Cite journal |last=Cybulski |first=Paweł |date=2020-01-09 |title=Spatial distance and cartographic background complexity in graduated point symbol map-reading task |url=http://dx.doi.org/10.1080/15230406.2019.1702102 |journal=Cartography and Geographic Information Science |volume=47 |issue=3 |pages=244–260 |doi=10.1080/15230406.2019.1702102 |bibcode=2020CGISc..47..244C |s2cid=213161788 |issn=1523-0406 |url-access=subscription}}</ref> Numerous studies of the usability of map applications have likewise taken advantage of eye tracking.<ref>{{Cite journal |last1=Manson |first1=Steven M. |last2=Kne |first2=Len |last3=Dyke |first3=Kevin R. |last4=Shannon |first4=Jerry |last5=Eria |first5=Sami |date=2012 |title=Using Eye-tracking and Mouse Metrics to Test Usability of Web Mapping Navigation |url=http://dx.doi.org/10.1559/1523040639148 |journal=Cartography and Geographic Information Science |volume=39 |issue=1 |pages=48–60 |doi=10.1559/1523040639148 |bibcode=2012CGISc..39...48M |s2cid=131449617 |issn=1523-0406 |url-access=subscription}}</ref><ref>{{Cite journal |last1=Popelka |first1=Stanislav |last2=Vondrakova |first2=Alena |last3=Hujnakova |first3=Petra |date=2019-05-30 |title=Eye-tracking Evaluation of Weather Web Maps |journal=ISPRS International Journal of Geo-Information |volume=8 |issue=6 |pages=256 |doi=10.3390/ijgi8060256 |bibcode=2019IJGI....8..256P |issn=2220-9964 |doi-access=free}}</ref>
The cartographic community's daily engagement with visual and spatial data has positioned it to contribute significantly to eye-tracking data visualization methods and tools.<ref name=":2">{{Cite journal |last1=Vojtechovska |first1=Michaela |last2=Popelka |first2=Stanislav |date=2023-08-12 |title=GazePlotter – tool for eye movement sequences visualization |journal=Abstracts of the ICA |volume=6 |pages=264– |doi=10.5194/ica-abs-6-264-2023 |bibcode=2023AbICA...6..264V |issn=2570-2106 |doi-access=free}}</ref> For example, cartographers have developed methods for integrating eye-tracking data with [[GIS]], using GIS software for further visualization and analysis.<ref>{{Cite journal |last1=Sultan |first1=Minha Noor |last2=Popelka |first2=Stanislav |last3=Strobl |first3=Josef |date=2022-06-24 |title=ET2Spatial – software for georeferencing of eye movement data |url=http://dx.doi.org/10.1007/s12145-022-00832-5 |journal=Earth Science Informatics |volume=15 |issue=3 |pages=2031–2049 |doi=10.1007/s12145-022-00832-5 |bibcode=2022EScIn..15.2031S |s2cid=249961269 |issn=1865-0473 |url-access=subscription}}</ref><ref>{{Cite journal |last1=Göbel |first1=Fabian |last2=Kiefer |first2=Peter |last3=Raubal |first3=Martin |date=2019-05-02 |title=Correction to: FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps |journal=GeoInformatica |volume=24 |issue=4 |pages=1061–1062 |doi=10.1007/s10707-019-00352-3 |s2cid=155184852 |issn=1384-6175 |doi-access=free}}</ref> The community has also delivered tools for visualizing eye-tracking data<ref>{{Cite journal |last1=Dolezalova |first1=Jitka |last2=Popelka |first2=Stanislav |date=2016-08-05 |title=ScanGraph: A Novel Scanpath Comparison Method Using Visualisation of Graph Cliques |url=https://bop.unibe.ch/JEMR/article/view/2522 |journal=Journal of Eye Movement Research |volume=9 |issue=4 |doi=10.16910/jemr.9.4.5 |issn=1995-8692 |doi-access=free}}</ref><ref name=":2" /> and a toolbox for identifying eye fixations based on the spatial component of eye-tracking data.<ref>{{Cite journal |last1=Krassanakis |first1=Vassilios |last2=Filippakopoulou |first2=Vassiliki |last3=Nakos |first3=Byron |date=2014-02-21 |title=EyeMMV toolbox: An eye movement post-analysis tool based on a two-step spatial dispersion threshold for fixation identification |journal=Journal of Eye Movement Research |volume=7 |issue=1 |doi=10.16910/jemr.7.1.1 |s2cid=38319871 |issn=1995-8692 |doi-access=free}}</ref>
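The spatial-dispersion idea behind such fixation identification can be illustrated with a minimal dispersion-threshold (I-DT-style) sketch: a window of gaze samples counts as a fixation when its spatial dispersion stays below a threshold for a minimum number of samples. The thresholds and gaze samples below are illustrative; the cited EyeMMV toolbox implements a more elaborate two-step variant.

<syntaxhighlight lang="python">
# Minimal dispersion-threshold (I-DT-style) fixation detection sketch.
# The cited EyeMMV toolbox implements a more elaborate two-step variant;
# thresholds and gaze samples here are illustrative toy values.

MAX_DISPERSION = 30  # pixels: (x range + y range) of a fixation window
MIN_SAMPLES = 4      # minimum window length to count as a fixation

def dispersion(window):
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt(points):
    """Return fixations as (start_index, end_index, centroid) tuples."""
    fixations, i = [], 0
    while i + MIN_SAMPLES <= len(points):
        j = i + MIN_SAMPLES
        if dispersion(points[i:j]) <= MAX_DISPERSION:
            # Grow the window while it stays spatially compact.
            while j < len(points) and dispersion(points[i:j + 1]) <= MAX_DISPERSION:
                j += 1
            window = points[i:j]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((i, j - 1, (cx, cy)))
            i = j
        else:
            i += 1
    return fixations

# Two compact clusters separated by a saccade-like jump (toy data).
gaze = [(100, 100), (103, 99), (101, 104), (98, 102), (102, 101),
        (300, 250), (305, 249), (301, 254), (298, 252), (303, 251)]
print(idt(gaze))  # two fixations, one per cluster
</syntaxhighlight>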