Making AI Smarter with an Artificial, Multisensory Integrated Neuron


Artificial neuron processes visual and tactile input together.

The feel of a cat’s fur can reveal some information, but seeing the feline provides critical details: Is it a housecat or a lion? The sound of fire crackling may be ambiguous, but its scent confirms that wood is burning. Our senses work together to give us a comprehensive understanding of our surroundings, especially when individual signals are subtle.

Researchers have developed a bio-inspired artificial neuron to process visual and tactile sensory inputs together. Image credit: Tyler Henderson/Penn State

In biology, the combined response to multiple sensory inputs can be greater than the sum of their individual contributions. Robots typically rely on more straightforward addition, but Penn State researchers have now harnessed this biological concept in artificial intelligence to develop the first artificial, multisensory integrated neuron.
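
As a rough illustration of that difference, the short Python sketch below contrasts simple additive fusion with a superadditive integration in which two weak, coincident signals produce a response larger than their sum. The function names, gain, and input values are illustrative assumptions, not the device model reported in the paper.

```python
# Illustrative sketch only: compares simple additive fusion with a
# superadditive integration of two weak sensory signals. The gain and
# threshold values are arbitrary assumptions, not parameters from the paper.

def additive_response(visual: float, tactile: float) -> float:
    """Naive fusion: the output is just the sum of the two inputs."""
    return visual + tactile

def superadditive_response(visual: float, tactile: float,
                           coupling_gain: float = 4.0) -> float:
    """Bio-inspired fusion: coincident inputs reinforce each other,
    so two weak signals together exceed the sum of their parts."""
    return visual + tactile + coupling_gain * visual * tactile

if __name__ == "__main__":
    weak_visual, weak_tactile = 0.3, 0.3   # faint, ambiguous cues
    print(additive_response(weak_visual, weak_tactile))       # 0.6
    print(superadditive_response(weak_visual, weak_tactile))  # 0.96
```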

The team was led by Saptarshi Das and published its U.S. National Science Foundation-supported work in Nature Communications.

Co-authors, from left, Muhtasim Ul Karim Sadaf, graduate student in engineering science and mechanics; Saptarshi Das, associate professor of engineering science and mechanics; and Andrew Pannone, graduate student in engineering science and mechanics, stand together in Das’ laboratory. Not pictured: co-authors Najam U Sakib and Harikrishnan Ravichandran, both graduate students in engineering science and mechanics. Image credit: Tyler Henderson/Penn State.

“Robots make decisions based on their environment, but their sensors do not generally talk to each other,” said Das. “A collective decision can be made through a sensor processing unit, but is that the most efficient or effective method? In the human brain, one sense can influence another, allowing the person to judge a situation better.”

A car might have one sensor scanning for obstacles while another senses darkness to modulate the intensity of the headlights. Individually, these sensors relay information to a central unit that then instructs the car to brake or adjust the headlights.

According to Das, this centralized approach consumes more energy than necessary. Allowing sensors to communicate directly with each other can be more efficient in terms of both energy and speed, particularly when the inputs to both are faint.
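
To make the architectural contrast concrete, here is a minimal, hypothetical Python sketch of the two approaches: a central unit that polls each sensor independently versus sensors whose outputs directly modulate one another. All class names, thresholds, and gains are assumptions for illustration only and do not describe the published device.

```python
# Illustrative sketch only: contrasts a central fusion loop, in which every
# sensor reading passes through one processing unit, with direct coupling,
# in which one sensor's output immediately modulates another. Class and
# method names are hypothetical, not from the published work.

class CentralUnit:
    """Every decision funnels through this unit: each cycle it polls both
    sensors, then issues commands; the extra hops cost time and energy."""
    def step(self, obstacle_reading: float, light_reading: float) -> dict:
        return {
            "brake": obstacle_reading > 0.5,
            "headlights_on": light_reading < 0.2,
        }

class CoupledSensors:
    """Sensors exchange signals directly: darkness raises the effective
    sensitivity of the obstacle sensor, so weak, coincident signals can
    still trigger an action without a round trip to a central processor."""
    def step(self, obstacle_reading: float, light_reading: float) -> dict:
        darkness = 1.0 - light_reading
        boosted_obstacle = obstacle_reading * (1.0 + darkness)  # cross-modulation
        return {
            "brake": boosted_obstacle > 0.5,
            "headlights_on": darkness > 0.8,
        }

if __name__ == "__main__":
    # Faint cues at dusk: neither reading alone would trigger braking.
    central = CentralUnit().step(obstacle_reading=0.4, light_reading=0.3)
    coupled = CoupledSensors().step(obstacle_reading=0.4, light_reading=0.3)
    print(central)  # {'brake': False, 'headlights_on': False}
    print(coupled)  # {'brake': True, 'headlights_on': False}
```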

“Biology enables small organisms to thrive in environments with limited resources, minimizing energy consumption in the process,” said Das.

“The requirements for different sensors are based on the context — in a dark forest, you’d rely more on listening than seeing, but we don’t make decisions based on just one sense. We have a complete sense of our surroundings, and our decision making is based on the integration of what we’re seeing, hearing, touching, smelling, et cetera. The senses evolved together in biology, but separately in AI. In this work, we’re looking to combine sensors and mimic how our brains work.”

Das said that an artificial multisensory neuron system could enhance sensor technology’s efficiency, paving the way for more eco-friendly AI uses. As a result, robots, drones and self-driving vehicles could navigate their environment more effectively while using less energy.

Source: NSF
