==Animal research==
{{See also|Remote control animal}}
Several laboratories have managed to read signals from monkey and rat [[cerebral cortices]] to operate BCIs to produce movement. Monkeys have moved [[Cursor (computers)|computer cursors]] and commanded robotic arms to perform simple tasks merely by thinking about the task and seeing the results, without motor output.<ref>Miguel Nicolelis et al. (2001) [http://www.dukemedicine.org/AboutUs/Facts_and_Statistics/historical_highlights/index/view Duke neurobiologist has developed system that allows monkeys to control robot arms via brain signals] {{webarchive |url=https://web.archive.org/web/20081219060005/http://www.dukemedicine.org/AboutUs/Facts_and_Statistics/historical_highlights/index/view |date=19 December 2008 }}</ref> In May 2008, photographs showing a monkey at the [[University of Pittsburgh Medical Center]] operating a robotic arm by thinking were published in multiple science publications.<ref>{{cite web| vauthors = Baum M | title = Monkey Uses Brain Power to Feed Itself With Robotic Arm| publisher = Pitt Chronicle| date = 6 September 2008| url = http://www.chronicle.pitt.edu/?p=1478| access-date = 6 July 2009| url-status = dead| archive-url = https://web.archive.org/web/20090910034547/http://www.chronicle.pitt.edu/?p=1478| archive-date = 10 September 2009| df = dmy-all}}</ref> Sheep have also been used to evaluate BCI technology, including Synchron's Stentrode. In 2020, a device from [[Elon Musk]]'s [[Neuralink]] was successfully implanted in a pig.<ref>{{cite web | vauthors = Lewis T |title= Elon Musk's Pig-Brain Implant Is Still a Long Way from 'Solving Paralysis' |url= https://www.scientificamerican.com/article/elon-musks-pig-brain-implant-is-still-a-long-way-from-solving-paralysis/ |website=[[Scientific American]] |date= November 2020 |access-date=23 March 2021}}</ref> In 2021, Musk announced that the company had successfully enabled a monkey to play video games using Neuralink's device.<ref>{{cite web | vauthors = Shead S |title=Elon Musk says his start-up Neuralink has wired up a monkey to play video games using its mind |url=https://www.cnbc.com/2021/02/01/elon-musk-neuralink-wires-up-monkey-to-play-video-games-using-mind.html |website=CNBC |date=February 2021 |access-date=23 March 2021}}</ref>

===Early work===
[[File:Monkey using a robotic arm.jpg|thumb|Monkey operating a robotic arm with brain–computer interfacing (Schwartz lab, University of Pittsburgh)]]
In 1969, [[operant conditioning]] studies by Fetz et al. at the Regional Primate Research Center and Department of Physiology and Biophysics, [[University of Washington School of Medicine]], showed that monkeys could learn to control the deflection of a [[biofeedback]] meter arm with neural activity.<ref>{{cite journal | vauthors = Fetz EE | title = Operant conditioning of cortical unit activity | journal = Science | volume = 163 | issue = 3870 | pages = 955–958 | date = February 1969 | pmid = 4974291 | doi = 10.1126/science.163.3870.955 | s2cid = 45427819 | bibcode = 1969Sci...163..955F }}</ref> Similar work in the 1970s established that monkeys could learn to control the firing rates of individual and multiple neurons in the primary [[motor cortex]] if they were rewarded accordingly.<ref>{{cite journal | vauthors = Schmidt EM, McIntosh JS, Durelli L, Bak MJ | title = Fine control of operantly conditioned firing patterns of cortical neurons | journal = Experimental Neurology | volume = 61 | issue = 2 | pages = 349–369 | date = September 1978 | pmid = 101388 | doi = 10.1016/0014-4886(78)90252-2 | s2cid = 37539476 }}</ref> [[Algorithms]] to reconstruct movements from [[motor cortex]] [[neurons]], which control movement, date back to the 1970s. In the 1980s, Georgopoulos at [[Johns Hopkins University]] found a mathematical relationship between the electrical responses of single motor cortex neurons in [[rhesus macaque|rhesus macaque monkeys]] and the direction in which they moved their arms. He also found that dispersed groups of neurons, in different areas of the monkeys' brains, collectively controlled motor commands. Due to equipment limitations, he was able to record the firings of neurons in only one area at a time.<ref>{{cite journal | vauthors = Georgopoulos AP, Lurito JT, Petrides M, Schwartz AB, Massey JT | title = Mental rotation of the neuronal population vector | journal = Science | volume = 243 | issue = 4888 | pages = 234–236 | date = January 1989 | pmid = 2911737 | doi = 10.1126/science.2911737 | s2cid = 37161168 | bibcode = 1989Sci...243..234G }}</ref> Several groups have been able to capture complex brain motor cortex signals by recording from [[neural ensemble]]s (groups of neurons) and using these to control external devices.{{Citation needed|date=May 2024}}
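In the simplified population-vector description that grew out of this work, the firing rate of a directionally tuned motor cortex neuron is modeled as a cosine function of the angle between the movement direction and the neuron's preferred direction, and the movement is estimated from a firing-rate-weighted sum of preferred directions (an idealized summary, not the exact fitting procedure of any one study):

<math>f_i(\theta) \approx b_i + m_i \cos\left(\theta - \theta_i^{\mathrm{pref}}\right), \qquad \mathbf{P} = \sum_{i=1}^{N} \frac{f_i - b_i}{m_i}\,\mathbf{c}_i,</math>

where <math>b_i</math> is the baseline rate, <math>m_i</math> the modulation depth, <math>\mathbf{c}_i</math> a unit vector along the preferred direction of neuron <math>i</math>, and the population vector <math>\mathbf{P}</math> points approximately along the intended movement.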
===Research===

====Kennedy and Yang Dan====
Phillip Kennedy (who founded Neural Signals in 1987) and colleagues built the first intracortical brain–computer interface by implanting neurotrophic-cone [[electrodes]] into monkeys.{{Citation needed|date=February 2012}}
[[File:LGN Cat Vison Recording.jpg|thumb|Yang Dan and colleagues' recordings of cat vision using a BCI implanted in the [[lateral geniculate nucleus]] (top row: original image; bottom row: recording)]]
In 1999, [[Yang Dan (neuroscientist)|Yang Dan]] et al. at [[University of California, Berkeley]] decoded neuronal firings to reproduce images seen by cats. The team used an array of electrodes embedded in the [[thalamus]] (which integrates the brain's sensory input). Researchers targeted 177 brain cells in the thalamic [[lateral geniculate nucleus]], which decodes signals from the [[retina]]. Neuron firings were recorded while the cats watched eight short movies. Using mathematical filters, the researchers decoded the signals to reconstruct recognizable scenes and moving objects.<ref>{{cite journal | vauthors = Stanley GB, Li FF, Dan Y | title = Reconstruction of natural scenes from ensemble responses in the lateral geniculate nucleus | journal = The Journal of Neuroscience | volume = 19 | issue = 18 | pages = 8036–8042 | date = September 1999 | pmid = 10479703 | pmc = 6782475 | doi = 10.1523/JNEUROSCI.19-18-08036.1999 }}</ref>

====Nicolelis====
{{See also|Walk Again Project}}
[[Duke University]] professor [[Miguel Nicolelis]] advocates using multiple electrodes spread over a greater area of the brain to obtain neuronal signals. After initial studies in rats during the 1990s, Nicolelis and colleagues developed BCIs that decoded brain activity in [[owl monkeys]] and used the devices to reproduce monkey movements in robotic arms. Monkeys' advanced reaching and grasping abilities and hand manipulation skills made them good test subjects. By 2000, the group had succeeded in building a BCI that reproduced owl monkey movements while the monkey operated a [[joystick]] or reached for food.<ref>{{cite journal | vauthors = Wessberg J, Stambaugh CR, Kralik JD, Beck PD, Laubach M, Chapin JK, Kim J, Biggs SJ, Srinivasan MA, Nicolelis MA | display-authors = 6 | title = Real-time prediction of hand trajectory by ensembles of cortical neurons in primates | journal = Nature | volume = 408 | issue = 6810 | pages = 361–365 | date = November 2000 | pmid = 11099043 | doi = 10.1038/35042582 | s2cid = 795720 | bibcode = 2000Natur.408..361W }}</ref> The BCI operated in real time and could remotely control a separate robot, but the monkeys received no feedback ([[open-loop controller|open-loop]] BCI).
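Decoders of this kind typically map recent ensemble firing-rate history onto hand kinematics using linear models. The following sketch illustrates the general idea with a ridge-regularized least-squares fit and invented bin widths, lag counts, and data; it is a minimal illustration of an ensemble decoder, not the published implementation:

<syntaxhighlight lang="python">
import numpy as np

def build_lagged_design(rates, n_lags):
    """Stack the most recent n_lags bins of every neuron's firing rate into one feature row per time step."""
    n_bins, _ = rates.shape
    return np.asarray([rates[t - n_lags:t].ravel() for t in range(n_lags, n_bins)])

def fit_linear_decoder(rates, hand_xy, n_lags=10, ridge=1.0):
    """Ridge-regularized least-squares mapping from lagged ensemble rates to 2-D hand position."""
    X = build_lagged_design(rates, n_lags)
    X = np.hstack([X, np.ones((X.shape[0], 1))])   # bias column
    Y = hand_xy[n_lags:]
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

def decode(rates, W, n_lags=10):
    """Predict hand position at each time step from the fitted weights."""
    X = build_lagged_design(rates, n_lags)
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return X @ W

# Hypothetical session: 50 neurons, 2000 bins of 100 ms, 2-D hand position
rng = np.random.default_rng(0)
rates = rng.poisson(5.0, size=(2000, 50)).astype(float)
hand_xy = rng.normal(size=(2000, 2))
W = fit_linear_decoder(rates, hand_xy)
trajectory = decode(rates, W)
</syntaxhighlight>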
[[File:Brain-computer interface (schematic).jpg|thumb|Diagram of the BCI developed by Miguel Nicolelis and colleagues for use on [[rhesus monkeys]]]]
Later experiments on [[rhesus monkeys]] included [[feedback]] and reproduced monkey reaching and grasping movements in a robot arm. Rhesus monkeys' deeply cleft and furrowed brains made them better models for human [[neurophysiology]] than owl monkeys. The monkeys were trained to reach and grasp objects on a computer screen by manipulating a joystick while corresponding movements by a robot arm were hidden.<ref name=carmena2003>{{cite journal | vauthors = Carmena JM, Lebedev MA, Crist RE, O'Doherty JE, Santucci DM, Dimitrov DF, Patil PG, Henriquez CS, Nicolelis MA | display-authors = 6 | title = Learning to control a brain-machine interface for reaching and grasping by primates | journal = PLOS Biology | volume = 1 | issue = 2 | pages = E42 | date = November 2003 | pmid = 14624244 | pmc = 261882 | doi = 10.1371/journal.pbio.0000042 | doi-access = free }}</ref><ref name=lebedev2005>{{cite journal | vauthors = Lebedev MA, Carmena JM, O'Doherty JE, Zacksenhouse M, Henriquez CS, Principe JC, Nicolelis MA | title = Cortical ensemble adaptation to represent velocity of an artificial actuator controlled by a brain-machine interface | journal = The Journal of Neuroscience | volume = 25 | issue = 19 | pages = 4681–4693 | date = May 2005 | pmid = 15888644 | pmc = 6724781 | doi = 10.1523/JNEUROSCI.4088-04.2005 }}</ref> The monkeys were later shown the robot and learned to control it by viewing its movements. The BCI used velocity predictions to control reaching movements and simultaneously predicted [[Grip strength|gripping force]]. In 2011, O'Doherty and colleagues demonstrated a BCI with sensory feedback in rhesus monkeys. The monkey controlled the position of an avatar arm while receiving sensory feedback through direct [[Cortical stimulation mapping|intracortical microstimulation (ICMS)]] in the arm representation area of the [[sensory cortex]].<ref name="Odoherty2003">{{cite journal | vauthors = O'Doherty JE, Lebedev MA, Ifft PJ, Zhuang KZ, Shokur S, Bleuler H, Nicolelis MA | title = Active tactile exploration using a brain-machine-brain interface | journal = Nature | volume = 479 | issue = 7372 | pages = 228–231 | date = October 2011 | pmid = 21976021 | pmc = 3236080 | doi = 10.1038/nature10489 | bibcode = 2011Natur.479..228O }}</ref>

====Donoghue, Schwartz, and Andersen====
[[File:164_Angell_Street.jpg|thumb|BCIs are a core focus of the [[Carney Institute for Brain Science]] at [[Brown University]].]]
Other laboratories that have developed BCIs and algorithms that decode neuron signals include those of [[John Donoghue (neuroscientist)|John Donoghue]] at the [[Carney Institute for Brain Science]] at [[Brown University]], Andrew Schwartz at the [[University of Pittsburgh]], and [[Richard A. Andersen (neuroscientist)|Richard Andersen]] at [[Caltech]]. These researchers produced working BCIs using recorded signals from far fewer neurons than Nicolelis (15–30 neurons versus 50–200 neurons). The Carney Institute reported training rhesus monkeys to use a BCI to track visual targets on a computer screen (closed-loop BCI) with or without a joystick.<ref>{{cite journal | vauthors = Serruya MD, Hatsopoulos NG, Paninski L, Fellows MR, Donoghue JP | title = Instant neural control of a movement signal | journal = Nature | volume = 416 | issue = 6877 | pages = 141–142 | date = March 2002 | pmid = 11894084 | doi = 10.1038/416141a | s2cid = 4383116 | bibcode = 2002Natur.416..141S }}</ref> The group created a BCI for three-dimensional tracking in virtual reality and reproduced BCI control in a robotic arm.<ref>{{cite journal | vauthors = Taylor DM, Tillery SI, Schwartz AB | title = Direct cortical control of 3D neuroprosthetic devices | journal = Science | volume = 296 | issue = 5574 | pages = 1829–1832 | date = June 2002 | pmid = 12052948 | doi = 10.1126/science.1070291 | s2cid = 9402759 | citeseerx = 10.1.1.1027.4335 | bibcode = 2002Sci...296.1829T }}</ref> The same group demonstrated that a monkey could feed itself pieces of fruit and marshmallows using a robotic arm controlled by the animal's brain signals.<ref>[http://www.pittsburghlive.com:8000/x/tribunereview/s_469059.html Pitt team to build on brain-controlled arm] {{webarchive |url=https://web.archive.org/web/20070704125118/http://www.pittsburghlive.com:8000/x/tribunereview/s_469059.html |date=4 July 2007 }}, ''Pittsburgh Tribune Review'', 5 September 2006.</ref><ref>{{YouTube|wxIgdOlT2cY}}</ref><ref>{{cite journal | vauthors = Velliste M, Perel S, Spalding MC, Whitford AS, Schwartz AB | title = Cortical control of a prosthetic arm for self-feeding | journal = Nature | volume = 453 | issue = 7198 | pages = 1098–1101 | date = June 2008 | pmid = 18509337 | doi = 10.1038/nature06996 | s2cid = 4404323 | bibcode = 2008Natur.453.1098V | url = https://zenodo.org/record/891045 }}</ref> Andersen's group used recordings of [[premovement neuronal activity|premovement activity]] from the [[posterior parietal cortex]], including signals created when experimental animals anticipated receiving a reward.<ref>{{cite journal | vauthors = Musallam S, Corneil BD, Greger B, Scherberger H, Andersen RA | title = Cognitive control signals for neural prosthetics | journal = Science | volume = 305 |
issue = 5681 | pages = 258–262 | date = July 2004 | pmid = 15247483 | doi = 10.1126/science.1097938 | s2cid = 3112034 | bibcode = 2004Sci...305..258M | url = https://resolver.caltech.edu/CaltechAUTHORS:20141121-110153014 }}</ref>

====Other research====
In addition to predicting [[kinematic]] and [[kinetic energy|kinetic]] parameters of limb movements, BCIs that predict the [[electromyographic]] or electrical activity of primate muscles are under development.<ref>{{cite journal | vauthors = Santucci DM, Kralik JD, Lebedev MA, Nicolelis MA | title = Frontal and parietal cortical ensembles predict single-trial muscle activity during reaching movements in primates | journal = The European Journal of Neuroscience | volume = 22 | issue = 6 | pages = 1529–1540 | date = September 2005 | pmid = 16190906 | doi = 10.1111/j.1460-9568.2005.04320.x | s2cid = 31277881 }}</ref> Such BCIs could restore mobility in paralyzed limbs by electrically stimulating muscles. Nicolelis and colleagues demonstrated that large neural ensembles can predict arm position. This work allowed BCIs to read arm movement intentions and translate them into actuator movements. Carmena and colleagues<ref name=carmena2003/> programmed a BCI that allowed a monkey to control reaching and grasping movements by a robotic arm. Lebedev and colleagues argued that brain networks reorganize to create a new representation of the robotic appendage in addition to the representation of the animal's own limbs.<ref name="lebedev2005" />

In 2019, researchers reported a BCI with the potential to help patients with speech impairment caused by neurological disorders. The BCI used high-density [[electrocorticography]] to record neural activity from a patient's brain and [[deep learning]] to synthesize speech.<ref>{{cite journal | vauthors = Anumanchipalli GK, Chartier J, Chang EF | title = Speech synthesis from neural decoding of spoken sentences | journal = Nature | volume = 568 | issue = 7753 | pages = 493–498 | date = April 2019 | pmid = 31019317 | doi = 10.1038/s41586-019-1119-1 | pmc = 9714519 | s2cid = 129946122 | bibcode = 2019Natur.568..493A }}</ref><ref>{{cite journal | vauthors = Pandarinath C, Ali YH | title = Brain implants that let you speak your mind | language = EN | journal = Nature | volume = 568 | issue = 7753 | pages = 466–467 | date = April 2019 | pmid = 31019323 | doi = 10.1038/d41586-019-01181-y | doi-access = free | bibcode = 2019Natur.568..466P }}</ref> In 2021, those researchers reported that a BCI could decode words and sentences in an [[anarthric]] patient who had been unable to speak for over 15 years.<ref name="Neuroprosthesis for Decoding Speech">{{cite journal | vauthors = Moses DA, Metzger SL, Liu JR, Anumanchipalli GK, Makin JG, Sun PF, Chartier J, Dougherty ME, Liu PM, Abrams GM, Tu-Chan A, Ganguly K, Chang EF | display-authors = 6 | title = Neuroprosthesis for Decoding Speech in a Paralyzed Person with Anarthria | journal = The New England Journal of Medicine | volume = 385 | issue = 3 | pages = 217–227 | date = July 2021 | pmid = 34260835 | doi = 10.1056/NEJMoa2027540 | pmc = 8972947 | s2cid = 235907121 }}</ref><ref>Belluck, Pam (14 July 2021). [https://www.nytimes.com/2021/07/14/health/speech-brain-implant-computer.html "Tapping Into the Brain to Help a Paralyzed Man Speak"]. ''The New York Times''.</ref>
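The cited speech studies trained deep neural networks to map cortical recordings to speech. As a drastically simplified, hypothetical stand-in for such models, the sketch below fits a plain softmax classifier from windowed neural features to a small word vocabulary; the feature dimensions, vocabulary size, and data are invented for illustration:

<syntaxhighlight lang="python">
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_word_classifier(features, labels, n_words, lr=0.1, epochs=200):
    """Gradient-descent fit of a multinomial logistic model from neural features to word labels."""
    n_trials, n_features = features.shape
    W = np.zeros((n_features, n_words))
    b = np.zeros(n_words)
    onehot = np.eye(n_words)[labels]
    for _ in range(epochs):
        probs = softmax(features @ W + b)              # per-trial word probabilities
        W -= lr * features.T @ (probs - onehot) / n_trials
        b -= lr * (probs - onehot).mean(axis=0)
    return W, b

def decode_words(features, W, b):
    """Return the most probable word index for each feature window."""
    return softmax(features @ W + b).argmax(axis=1)

# Hypothetical data: 200 attempted-word trials, 128 neural features each, 50-word vocabulary
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 128))
y = rng.integers(0, 50, size=200)
W, b = train_word_classifier(X, y, n_words=50)
predicted = decode_words(X, W, b)
</syntaxhighlight>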
The biggest impediment to BCI technology is the lack of a sensor modality that provides safe, accurate, and robust access to brain signals. The use of a better sensor would expand the range of communication functions that a BCI can provide. Development and implementation of a BCI system are complex and time-consuming. In response to this problem, Gerwin Schalk has been developing [[BCI2000]], a general-purpose system for BCI research, since 2000.<ref>{{cite web|url=https://www.neurotechcenter.org/publications/2010/using-bci2000-bci-research|title=Using BCI2000 in BCI Research|publisher=National Center for Adaptive Neurotechnology|accessdate=5 December 2023}}</ref>

A new 'wireless' approach uses [[light-gated ion channel]]s such as [[channelrhodopsin]] to control the activity of genetically defined subsets of neurons ''[[in vivo]]''. In the context of a simple learning task, illumination of [[transfected]] cells in the [[Somatosensory system|somatosensory cortex]] influenced decision-making in mice.<ref>{{cite journal | vauthors = Huber D, Petreanu L, Ghitani N, Ranade S, Hromádka T, Mainen Z, Svoboda K|author7-link=Karel Svoboda (scientist) | title = Sparse optical microstimulation in barrel cortex drives learned behaviour in freely moving mice | journal = Nature | volume = 451 | issue = 7174 | pages = 61–64 | date = January 2008 | pmid = 18094685 | pmc = 3425380 | doi = 10.1038/nature06445 | bibcode = 2008Natur.451...61H }}</ref>

BCIs have led to a deeper understanding of neural networks and the [[central nervous system]]. Although neuroscientists generally expect neurons to have their greatest effect when working together, research has shown that single neurons can be conditioned through BCIs to fire in patterns that allow primates to control motor outputs. BCIs also led to the single-neuron insufficiency principle, which states that even with a well-tuned firing rate a single neuron carries only limited information, so the highest accuracy is achieved by recording from ensembles. Other principles discovered with BCIs include the neuronal multitasking principle, the neuronal mass principle, the neural degeneracy principle, and the plasticity principle.<ref>{{cite journal | vauthors = Nicolelis MA, Lebedev MA | title = Principles of neural ensemble physiology underlying the operation of brain-machine interfaces | journal = Nature Reviews. Neuroscience | volume = 10 | issue = 7 | pages = 530–540 | date = July 2009 | pmid = 19543222 | doi = 10.1038/nrn2653 | s2cid = 9290258 }}</ref>

BCIs have also been proposed for users without disabilities. Passive BCIs assess and interpret changes in the user's state during [[Human–computer interaction]] (HCI). In a secondary, implicit control loop, the system adapts to its user, improving its [[usability]].<ref name=":0">{{cite journal |vauthors=Zander TO, Kothe C |date=April 2011 |title=Towards passive brain-computer interfaces: applying brain-computer interface technology to human-machine systems in general |journal=Journal of Neural Engineering |volume=8 |issue=2 |pages=025005 |bibcode=2011JNEng...8b5005Z |doi=10.1088/1741-2560/8/2/025005 |pmid=21436512 |s2cid=37168897}}</ref>
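As an illustration of such an implicit loop, a passive BCI might estimate mental workload from EEG band power and quietly adapt the interface; the channel choice, frequency bands, and threshold below are illustrative assumptions rather than values from the cited study:

<syntaxhighlight lang="python">
import numpy as np

def band_power(eeg, fs, lo, hi):
    """Mean spectral power of a 1-D EEG segment in the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return spectrum[(freqs >= lo) & (freqs <= hi)].mean()

def workload_index(eeg, fs):
    """Illustrative workload proxy: theta/alpha band-power ratio of one EEG channel."""
    theta = band_power(eeg, fs, 4.0, 7.0)
    alpha = band_power(eeg, fs, 8.0, 12.0)
    return theta / (alpha + 1e-12)

def adapt_interface(index, threshold=1.5):
    """Secondary, implicit control loop: simplify the display when estimated workload is high."""
    return "reduce information density" if index > threshold else "keep current layout"

# Hypothetical usage: one 2-second window of simulated EEG sampled at 256 Hz
fs = 256
segment = np.random.default_rng(2).normal(size=2 * fs)
print(adapt_interface(workload_index(segment, fs)))
</syntaxhighlight>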
BCI systems can potentially be used to encode signals from the periphery. These sensory BCI devices enable real-time, behaviorally relevant decisions based upon closed-loop neural stimulation.<ref>{{cite journal | vauthors = Richardson AG, Ghenbot Y, Liu X, Hao H, Rinehart C, DeLuccia S, Torres Maldonado S, Boyek G, Zhang M, Aflatouni F, Van der Spiegel J, Lucas TH | display-authors = 6 | title = Learning active sensing strategies using a sensory brain-machine interface | journal = Proceedings of the National Academy of Sciences of the United States of America | volume = 116 | issue = 35 | pages = 17509–17514 | date = August 2019 | pmid = 31409713 | pmc = 6717311 | doi = 10.1073/pnas.1909953116 | bibcode = 2019PNAS..11617509R | doi-access = free }}</ref>

====The BCI Award====
The [[Annual BCI Research Award|BCI Research Award]] is awarded annually in recognition of innovative BCI research. Each year, a renowned research laboratory is asked to judge projects, with a jury of BCI experts recruited by that laboratory. The jury selects twelve nominees and then chooses first-, second-, and third-place winners, who receive awards of $3,000, $2,000, and $1,000, respectively.{{cn|date=April 2025}}