==== Non-brain-based human–computer interface (physiological computing) ====
Human–computer interaction can exploit other recording modalities, such as [[electrooculography]] and eye-tracking. These modalities do not record brain activity and therefore do not qualify as BCIs.<ref>{{Cite journal |last=Fairclough |first=Stephen H. |date=January 2009 |title=Fundamentals of physiological computing |url=https://academic.oup.com/iwc/article-lookup/doi/10.1016/j.intcom.2008.10.011 |journal=Interacting with Computers |language=en |volume=21 |issue=1–2 |pages=133–145 |doi=10.1016/j.intcom.2008.10.011|s2cid=16314534 }}</ref>

=====Electrooculography (EOG)=====
In 1989, a study reported control of a mobile robot by eye movement using electrooculography signals. The robot was driven to a goal point using five EOG commands, interpreted as forward, backward, left, right, and stop.<ref>{{cite book |title=Advances in Robot Design and Intelligent Control |vauthors=Bozinovski S |year=2017 |isbn=978-3-319-49057-1 |series=Advances in Intelligent Systems and Computing |volume=540 |pages=449–462 |chapter=Signal Processing Robotics Using Signals Generated by a Human Head: From Pioneering Works to EEG-Based Emulation of Digital Circuits |doi=10.1007/978-3-319-49058-8_49}}</ref>

=====Pupil-size oscillation=====
A 2016 article described a non-EEG-based HCI that requires neither [[visual fixation]] nor the ability to move the eyes.<ref>{{cite journal |vauthors=Mathôt S, Melmi JB, van der Linden L, Van der Stigchel S |year=2016 |title=The Mind-Writing Pupil: A Human-Computer Interface Based on Decoding of Covert Attention through Pupillometry |journal=PLOS ONE |volume=11 |issue=2 |pages=e0148805 |bibcode=2016PLoSO..1148805M |doi=10.1371/journal.pone.0148805 |pmc=4743834 |pmid=26848745 |doi-access=free}}</ref> The interface is based on covert [[interest (emotion)|interest]]: the user directs attention to a chosen letter on a virtual keyboard without looking directly at it. Each letter has its own background circle, which micro-oscillates in brightness differently from the others. A letter is selected by finding the best fit between the user's unintentional pupil-size oscillation and the brightness oscillation pattern of that letter's circle. Accuracy is further improved by the user mentally rehearsing the words 'bright' and 'dark' in synchrony with the brightness transitions of the circle.
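
The best-fit selection rule can be illustrated in a few lines of code: correlate the measured pupil-size trace against each circle's known brightness pattern and pick the best match. The sketch below is a minimal illustration of that idea only; the sampling rate, the choice of square-wave oscillations, the assumed pupillary lag, and all names are assumptions for demonstration, not details of the cited study's implementation.

<syntaxhighlight lang="python">
# Minimal sketch of best-fit letter selection from pupil-size oscillation.
# All parameters (sampling rate, frequencies, lag) are illustrative assumptions.
import numpy as np

SAMPLE_RATE = 60                        # Hz; assumed eye-tracker sampling rate
N_SAMPLES = int(SAMPLE_RATE * 4.0)      # one 4-second selection trial
PUPIL_LAG = int(0.3 * SAMPLE_RATE)      # assumed pupillary response delay (~300 ms)

def brightness_patterns(n_letters: int) -> np.ndarray:
    """Give each letter's background circle a distinct brightness
    oscillation (here: square waves at different frequencies)."""
    t = np.arange(N_SAMPLES) / SAMPLE_RATE
    freqs = 0.5 + 0.25 * np.arange(n_letters)   # 0.5, 0.75, 1.0, ... Hz
    return np.sign(np.sin(2 * np.pi * freqs[:, None] * t))

def decode_letter(pupil_trace: np.ndarray, patterns: np.ndarray) -> int:
    """Pick the letter whose lag-shifted brightness pattern best matches
    the pupil trace. Pupils constrict when brightness rises, so the
    attended circle's bright phase should coincide with a smaller pupil."""
    lagged = patterns[:, :N_SAMPLES - PUPIL_LAG]
    trace = pupil_trace[PUPIL_LAG:]
    trace = (trace - trace.mean()) / trace.std()
    scores = []
    for pat in lagged:
        pat = (pat - pat.mean()) / pat.std()
        # Negative correlation expected: brighter -> smaller pupil.
        scores.append(-np.dot(pat, trace) / len(trace))
    return int(np.argmax(scores))

# Toy usage: simulate a user covertly attending to letter 2.
patterns = brightness_patterns(n_letters=5)
rng = np.random.default_rng(0)
attended = 2
pupil = np.zeros(N_SAMPLES)
pupil[PUPIL_LAG:] = -patterns[attended, :N_SAMPLES - PUPIL_LAG]
pupil += 0.5 * rng.standard_normal(N_SAMPLES)    # measurement noise
assert decode_letter(pupil, patterns) == attended
</syntaxhighlight>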