Our eyes are never at rest. Instead, they remain in motion, even between our voluntary gaze shifts, through fixational eye movements—small, continuous movements of the eye that we are not aware of making.
Scientists have long sought to understand how we humans can perceive the world as stable even though our eyes are constantly moving.
Past research has suggested that, in the intervals between voluntary gaze shifts, the human visual system builds a picture of a stable world by relying solely on sensory inputs from fixational eye movements. According to new research by a team at the University of Rochester, however, another contributing factor may exist.
In a paper published in Nature Communications, the researchers—including Michele Rucci, a professor in the Department of Brain and Cognitive Sciences and at the University’s Center for Visual Science, and first author Zhetuo Zhao, a PhD student in Rucci’s lab—report that the visual system not only receives sensory inputs from fixational eye movements but also possesses knowledge of the motor behavior involved in those movements.
“The human brain has a very precise knowledge of how the eyes move, even if humans are not aware of moving them, and they use this knowledge to infer spatial relations and perceive the world not as blurry but as stable,” Rucci says.
The results of the research reveal that spatial representations—that is, the locations of objects in relation to other objects—are based on a combination of sensory and motor activity from both voluntary and involuntary eye movements, which is contrary to the prevailing understanding, Rucci explains.
“It was already clear that the visual system uses sensory and motor knowledge from large voluntary movements, either gaze shifts we perform to look at different parts of a scene, or tracking movements for following moving objects,” he says. “But scientists didn’t think smaller, involuntary movements like fixational eye movements could be used to convey information through motor signals.”
Instead, the research shows the visual system continually monitors motor activity, even when people believe they are maintaining a steady gaze. The research also shows that vision has computational strategies similar to other senses, such as touch and smell, where motor behavior profoundly affects incoming sensory signals.
The results have important implications for future studies of visual perception and will help researchers better understand visual impairments that involve abnormal eye movements.
“Our study unveils that involuntary eye movements, which are widely discarded as motor noise, make major contributions to spatial representations of the world,” Zhao says. “As we show, studying spatial representations without considering motor activity—as is often done in current neuroscience—is severely limiting.”
Source: University of Rochester