by University of Toronto
An image of a beautiful beach conjures up certain sensations—one can imagine the warmth of the sun, and the sound of waves breaking on shore. But how does the human brain produce these impressions when an individual isn’t actually standing on a beach, basking in the sun’s rays or listening to the sound of the waves?
Scientists at the University of Toronto exploring this mystery found that the brain's prefrontal cortex, a region known primarily for its role in regulating behaviour, inhibiting impulses and supporting cognitive flexibility, produces such sensations based on information provided by the various senses. Their findings provide new insights into the poorly understood role of the prefrontal cortex in human perception.
Using a combination of photographs, sounds and even heated massage stones, the researchers investigated patterns of neural activity in the prefrontal cortex, as well as in the other regions of the brain that process stimulation from the senses. They found that the prefrontal patterns were remarkably similar across the senses.
“Whether an individual was directly exposed to warmth, for example, or simply looking at a picture of a sunny scene, we saw the same pattern of neural activity in the prefrontal cortex,” said Dirk Bernhardt-Walther, an associate professor in the department of psychology in the Faculty of Arts & Science, and coauthor of a study published last week in the Journal of Neuroscience describing the findings. “The results suggest that the prefrontal cortex generalizes perceptual experiences that originate from different senses.”
To understand how the human brain processes the torrent of information from the environment, researchers often study the senses in isolation, with much prior work focused on the visual system. Bernhardt-Walther says that while such work is illuminating and important, it is equally important to find out how the brain integrates information from the different senses and how it uses that information in a task-directed manner. "Understanding the basics of these capabilities provides the foundation for research on disorders of perception," he said.
Using functional magnetic resonance imaging (fMRI) to capture brain activity, the researchers conducted two experiments with the same participants, drawing on the fact that regions of the brain respond differently depending on the intensity of stimulation.
In the first, the participants viewed a series of images of various scenes—including beaches, city streets, forests and train stations—and were asked to judge if the scenes were warm or cold and noisy or quiet. Throughout, neural activity across several regions of the brain was tracked.
In the second experiment, participants were first handed a series of massage stones that were either heated to 45°C or cooled to 9°C, and were later exposed to both quiet and noisy sounds, such as birds, people and waves at a beach.
"When we compared the patterns of activity in the prefrontal cortex, we could determine temperature both from the stone experiment and from the experiment with pictures, because the neural activity patterns for temperature were so consistent between the two," said the study's lead author, Yaelan Jung, who recently completed her Ph.D. at U of T working with Bernhardt-Walther and is now a postdoctoral researcher at Emory University.
“We could successfully determine whether a participant was holding a warm or a cold stone from patterns of brain activity in the somatosensory cortex, which is the part of the brain that receives and processes sensory information from the entire body—while brain activity in the visual cortex told us if they were looking at an image of a warm or cold scene.”
The patterns were so compatible that a decoder trained on prefrontal brain activity from the stone experiment was able to predict the temperature of a scene depicted in an image as it was viewed.
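In machine-learning terms, this is cross-modal decoding: a classifier is trained on brain-activity patterns recorded in one sensory condition and then tested on patterns from another. The sketch below illustrates the general idea with synthetic data and a linear classifier in scikit-learn; all variable names and numbers are hypothetical and are not taken from the study.

```python
# A minimal sketch of cross-modal decoding with synthetic data.
# Everything here is illustrative; the study's actual preprocessing
# and classifier choices may differ.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical voxel patterns from the prefrontal cortex:
# one row per fMRI trial, one column per voxel.
n_trials, n_voxels = 80, 200
X_stones = rng.normal(size=(n_trials, n_voxels))  # trials: holding stones
y_stones = rng.integers(0, 2, size=n_trials)      # 1 = warm, 0 = cold
X_images = rng.normal(size=(n_trials, n_voxels))  # trials: viewing scenes
y_images = rng.integers(0, 2, size=n_trials)      # 1 = warm, 0 = cold

# Train a decoder on the tactile (stone) experiment...
decoder = make_pipeline(StandardScaler(), LinearSVC())
decoder.fit(X_stones, y_stones)

# ...then test it on the visual (picture) experiment. With this random
# data, accuracy hovers around chance (0.5); reliably above-chance
# accuracy on real data would indicate a shared, modality-general
# temperature code in the prefrontal cortex.
print(f"Cross-modal accuracy: {decoder.score(X_images, y_images):.2f}")
```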
“It tells us about the relationship between someone feeling warmth by looking at a picture versus actually touching a warm object,” Jung said.
Similarly, the researchers could decode noisy versus quiet sounds from the brain’s auditory cortex and pictures of noisy versus quiet scenes from the visual cortex.
“Overall, the neural activity patterns in the prefrontal cortex produced by participants viewing the images were the same as those triggered by actual experience of temperature and noise level,” said Jung.
The researchers suggest the findings may open a new avenue to study how the brain manages to process and represent complex real-world attributes that span multiple senses, even without directly experiencing them.
“In understanding how the human brain integrates information from different senses into higher-level concepts, we may be able to pinpoint the causes of specific inabilities to recognize particular kinds of objects or concepts,” said Bernhardt-Walther.
“Our results might help people with limitations in one sensory modality to compensate with another and reach the same or very similar conceptual representations in their prefrontal cortex, which is essential for making decisions about their environment.”