Directing attention via machine
A new brain-machine interface demonstrates how humans direct their thoughts
Using a novel brain-machine interface linking neurons with a visual display, researchers reveal how humans pay attention to some things and ignore the wealth of distracting information surrounding them. The tool, described this week in Nature, may someday provide assistance to people with neurological impairments such as locked-in syndrome.
"This is the first I know of recording populations of individual neurons during an attention task [in humans]," said John Reynolds, a neurobiologist at the Salk Institute in La Jolla, California, who was not involved in the study. "It's really consistent with our thinking about the way attentional mechanisms work."
Humans' senses are constantly bombarded with information. "We don't know how the brain decides to attend to this thing and not that," said first author Moran Cerf, a postdoctoral fellow at the University of California, Los Angeles, and the California Institute of Technology.
To find out, the researchers seized a rare opportunity to record individual neurons firing in live human brains. To pinpoint the source of intractable epileptic seizures, patients sit in a hospital with their medial temporal lobe -- a brain region that plays a key role in memory -- hooked up to electrodes, sometimes for ten days at a time, waiting for a seizure to occur. Cerf, together with neurosurgeon Itzhak Fried of UCLA and Christof Koch, a neuroscientist at Caltech, used these sessions to record human brain activity in the temporal lobe, inviting patients to play a game in which they would control computer images by focusing their attention.
With 12 volunteers, the researchers first established links between specific neurons and familiar images -- of the Eiffel Tower, Bill Clinton, and Marilyn Monroe, for example -- by showing people pictures and noting which specific neurons became active. Once they identified at least four images that activated specific neurons, the researchers challenged the patients by showing a target image, say of Bill Clinton, and superimposing another image on top. They then asked the patient to pay attention to the target image and fade out the competing image.
The team designed the first automated system to relay neural impulses to a computer controlling the images. Hooked up to the machine, patients were able to enhance the target image onscreen in real time simply by directing their thoughts. In 69 percent of trials, patients successfully enhanced the target picture, fading out the competing image. "They got very good at it after a few minutes," said Cerf.
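The core of such a closed-loop setup is a decoding step that maps the relative firing of two image-selective neurons onto the on-screen opacity of the two superimposed pictures. The sketch below is purely illustrative -- the function name, the rate-ratio rule, and the smoothing parameter are assumptions, not details of the published system.

```python
# Hypothetical sketch of a closed-loop decoding step: two firing rates
# (target-selective and distractor-selective neurons) are mapped to the
# target image's on-screen opacity. All names and parameters here are
# illustrative, not taken from the system described in the paper.

def update_opacity(target_rate, distractor_rate, smoothing=0.0, prev=0.5):
    """Map two firing rates (spikes/s) to the target image's opacity in [0, 1].

    The distractor's opacity would be 1 minus this value, so boosting
    the target neuron's firing fades the competing image out.
    """
    total = target_rate + distractor_rate
    raw = 0.5 if total == 0 else target_rate / total
    # Optional smoothing against noisy spike counts: blend with the
    # previous opacity instead of jumping to the raw ratio.
    return (1 - smoothing) * raw + smoothing * prev

# Example: target neuron fires at 30 spikes/s, distractor's at 10.
opacity = update_opacity(30.0, 10.0)
print(round(opacity, 2))  # 0.75 -> target shown at 75% opacity
```

Run in a loop over successive spike-count windows, a rule like this makes the display track the patient's attentional state in real time.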
Supporting the prevailing model of how selective attention works in the brain, the researchers found that when attention was focused, the neuron associated with the target image fired while the neuron associated with the competing image was inhibited -- a mechanism known as biased competition. "We can actually see competition between neurons," said Cerf.
"Evolution endowed us with the ability to select what's important by attending to it," said Reynolds. Though subjects were presented with multiple visual stimuli, the temporal lobe was able to override that external input and direct its attention via internal thoughts. "Idealism tops realism," said Cerf. "What's in your mind can trump what reality offers."
The results are consistent with observations in non-human primate studies, added Reynolds, so researchers may now be able to draw more connections between what's happening in the brains of monkeys and humans during attention tasks.

Additionally, the new technique may have practical applications. Past studies have validated brain-machine interfaces in which monkeys are able to control robots with the motor regions of the brain, but this new interface suggests such devices can be controlled by higher cognitive mechanisms such as attention.
If researchers can increase the number of neurons they know to be associated with specific objects from a handful to over 100, it might be possible to design a system to help patients with locked-in syndrome, in which a person is awake but can't move or communicate verbally due to paralysis. "We could actually learn how their brain looks when they think of water, food, or pain," said Cerf, allowing the patient to communicate with doctors.
Cerf, M. et al., "On-line, voluntary control of human temporal lobe neurons," Nature, 467:1104-8, 2010.