By Rich Haridy March 22, 2021
New research offers an early proof-of-concept demonstrating how ultrasound recording could be effectively used in future brain-machine interfaces.
Brain-machine interfaces are one of those incredible ideas that were once the preserve of science fiction. In recent years, however, scientists have begun to experiment with primitive forms of the technology, even going so far as to help a quadriplegic control an exoskeleton using tiny electrode sensors implanted in his brain.
Perhaps the best-known recent investigation into brain-machine interfaces has come from Elon Musk’s Neuralink, which is attempting to develop a tiny, easily implantable device that can instantly read and relay neural activity. But however easily Neuralink's device can be delivered, this kind of brain-machine interface still fundamentally requires hardware to be surgically implanted in the brain.
A new study led by researchers from Caltech demonstrates a far less invasive brain-machine interface using functional ultrasound (fUS) technology. In the proof-of-concept study, the researchers used the ultrasound technique to record brain activity in monkeys and then used that data to predict the animals' subsequent motor movements.
The preliminary research used non-human primates to explore whether ultrasound recordings could be used to predict behavior. Mikhail Shapiro, one of the authors of the new study, says the first question the researchers asked was whether high-resolution blood flow dynamics in the brain, as measured by ultrasound, could be associated with animal behavior.
“The answer is yes,” Shapiro says. “This technique produced detailed images of the dynamics of neural signals in our target region that could not be seen with other non-invasive techniques like fMRI. We produced a level of detail approaching electrophysiology, but with a far less invasive procedure.”
The neural activity shown in these brain images was decoded with a machine learning algorithm and matched to movements, a process shown to accurately predict movements even before they happened. (Image: S. Norman / Caltech)
Focusing on activity in the posterior parietal cortex, a brain region known to be involved in planning movement, the researchers found they could effectively associate ultrasound readings with subsequent physical actions. A machine learning algorithm was trained to correlate the ultrasound data with the animals’ physical movements.
The results revealed the system could predict whether an animal was about to move its eyes left or right with 78 percent accuracy, and whether it was about to reach out to its left or right with 89 percent accuracy.
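To give a sense of how this kind of decoding works in practice, here is a minimal sketch in Python: each trial is treated as a short window of fUS image frames, flattened into a feature vector, and a linear classifier is trained to predict left versus right. The data shapes, the synthetic signal, and the PCA-plus-linear-discriminant pipeline are all illustrative assumptions, not the study's actual method.

```python
# Minimal sketch of fUS movement decoding, under assumed data dimensions.
# Each trial = a short window of image frames; the label = movement direction.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

n_trials, n_frames, h, w = 200, 4, 32, 32    # hypothetical dimensions
labels = rng.integers(0, 2, n_trials)         # 0 = left, 1 = right

# Stand-in for real fUS data: noise plus a faint label-dependent signal
# confined to one half of the image, mimicking localized blood flow changes.
trials = rng.normal(size=(n_trials, n_frames, h, w))
trials[labels == 1, :, :, : w // 2] += 0.15

# Flatten each trial's window into one feature vector, reduce dimensionality
# with PCA, then classify with linear discriminant analysis.
X = trials.reshape(n_trials, -1)
decoder = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())

scores = cross_val_score(decoder, X, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

The key idea the sketch captures is that even a simple linear decoder can pick up spatially localized blood flow changes once the image windows are treated as high-dimensional feature vectors.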
“We pushed the limits of ultrasound neuroimaging and were thrilled that it could predict movement,” explains Sumner Norman, co-first-author on the study. “What’s most exciting is that fUS is a young technique with huge potential – this is just our first step in bringing high performance, less invasive BMI to more people.”
Perhaps the most apparent limitation raised by this preliminary research is latency. The system that was tested needed around two seconds of data to predict the animals’ movements, but the researchers suggest this delay could be reduced in the future through a variety of technological improvements.
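That two-second figure translates directly into a buffering requirement: a streaming decoder cannot emit a prediction until a full window of frames has accumulated. A hypothetical sketch, assuming a 2 Hz frame rate and a decoder fitted on flattened windows like the one above:

```python
# Hypothetical streaming wrapper illustrating the latency floor: no
# prediction is possible until ~2 s of frames have been buffered.
# The frame rate and window length are assumptions for illustration.
import numpy as np
from collections import deque

FRAME_RATE_HZ = 2            # assumed fUS imaging rate
WINDOW_SECONDS = 2.0         # the article's ~2 s of data per prediction
WINDOW_FRAMES = int(FRAME_RATE_HZ * WINDOW_SECONDS)

buffer = deque(maxlen=WINDOW_FRAMES)  # sliding window of recent frames

def on_new_frame(frame, decoder):
    """Accumulate frames; predict only once a full window is buffered.

    Latency is bounded below by WINDOW_SECONDS, so shrinking the window
    or raising the frame rate are the obvious routes to faster output.
    """
    buffer.append(np.asarray(frame).ravel())
    if len(buffer) < WINDOW_FRAMES:
        return None  # not enough data yet
    features = np.concatenate(buffer)
    return decoder.predict(features[np.newaxis, :])[0]
```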
So functional ultrasound can now be added to the list of experimental brain-machine interface techniques being tested, hopefully offering a future option for those keen on controlling machines with their mind but not so keen on having electrodes implanted in their brain.
The new study was published in the journal Neuron.
Source: Caltech