A new study authored by Marc Schieber, M.D., Ph.D., and Kevin Mazurek, Ph.D., of the University of Rochester Medical Center Department of Neurology and the Del Monte Institute for Neuroscience, appears today in the journal Neuron. It shows that very low levels of electrical stimulation delivered directly to an area of the brain responsible for motor function can instruct an appropriate response or action, essentially replacing the signals we would normally receive from the parts of the brain that process what we hear, see, and feel.
“The analogy is what happens when we approach a red light,” said Schieber. “The light itself does not cause us to step on the brake; rather, our brain has been trained to process this visual cue and send signals to other parts of the brain that control movement. In this study, what we describe is akin to replacing the red light with an electrical stimulation that the brain has learned to associate with the need to take an action that stops the car.”
The findings could have significant implications for the development of brain-computer interfaces and neuro-prosthetics, which would allow a person to control a prosthetic device by tapping into the electrical activity of their brain.
To be effective, these technologies must not only receive output from the brain but also deliver input. For example, can a mechanical arm tell the user that the object they are holding is hot or cold? Delivering this information to the part of the brain responsible for processing sensory inputs does not work, however, if that part of the brain is injured or its connections to the motor cortex are lost. In these instances, some form of input needs to be generated that replaces the signals that combine sensory perception with motor control, and the brain needs to “learn” what these new signals mean.
“Researchers have been interested primarily in stimulating the primary sensory cortices to input information into the brain,” said Schieber. “What we have shown in this study is that you don’t have to be in a sensory-receiving area in order for the subject to have an experience they can identify.”
A similar approach is employed with cochlear implants for hearing loss, which translate sounds into electrical stimulation of the inner ear; over time, the brain learns to interpret these inputs as sound.
In the new study, the researchers detail a set of experiments in which monkeys were trained to perform a task when presented with a visual cue, either turning, pushing, or pulling specific objects when prompted by different lights. While this occurred, the animals simultaneously received a very mild electrical stimulus, called micro-stimulation, in different areas of the premotor cortex (the part of the brain that initiates movement) depending upon the combination of task and light.
The researchers then replicated the experiments, but this time omitted the visual cue of the lights and instead only delivered the micro-stimulation. The animals were able to successfully identify and perform the tasks they had learned to associate with the different electrical inputs. When the pairing of micro-stimulation with a particular action was reshuffled, the animals were able to adjust, indicating that the association between stimulation and a specific movement was learned and not fixed.
“Most work on the development of inputs to the brain for use with brain-computer interfaces has focused primarily on the sensory areas of the brain,” said Mazurek. “In this study, we show you can expand the neural real estate that can be targeted for therapies. This could be very important for people who have lost function in areas of their brain due to stroke, injury, or other diseases. We can potentially bypass the damaged part of the brain where connections have been lost and deliver information to an intact part of the brain.”
The study was supported with funding from the National Institute of Neurological Disorders and Stroke.