Our senior communications manager wearing an EEG cap and virtual reality headset. Photo credits: Magdalena Solyga/FMI
November 20, 2025
Inside the tunnel: Experiencing the brain’s response to sensory mismatches
In this first-person account, FMI’s senior communications manager describes taking part in an early human trial that adapts previous experiments in mice to explore how the human brain responds when visual and auditory information suddenly falls out of sync.
When I arrived in the room where seminars are usually held, I barely recognized it. All the chairs had been cleared away to create space. At the front stood a desk with a computer and a trolley with a virtual reality headset and various pieces of equipment.
Magdalena Solyga — a postdoctoral researcher with Georg Keller — was sitting at the desk, waiting for me. She greeted me warmly and took a few minutes to explain what we were about to do and the science behind it. The experiment, she told me, is part of an effort to understand how the brain responds when its expectations about the world are suddenly violated.
Previous work from the Keller group had shown that when mice ran through a virtual tunnel and the visual scene suddenly froze, their brains produced a strong signal in the visual cortex — a kind of neural “surprise” when the world didn’t behave as expected.
Further experiments revealed that the same thing happens in the auditory cortex when sound is disrupted, and that combining mismatches in both sight and sound leads to an even stronger response. Now, the team is adapting this experiment for humans, using virtual reality headsets and brain activity recordings to measure how our neurons respond to similar sensory surprises.
After signing the consent form, I sat down while Solyga fitted an EEG cap to my head — a network of sensors that would record my brain’s electrical activity. Over that came the virtual reality headset, and within moments, the seminar room around me was replaced by a long, oval-shaped tunnel whose walls were patterned with black and white stripes.
The task was simple: move forward at a steady pace. As I did, the striped walls flowed past smoothly, and a sound played from a speaker positioned at one end of the tunnel. The closer I moved toward it, the louder it became. Then, without warning, the stripes froze and the sound cut out. My legs kept moving, but I could feel a sudden mismatch between what my body was doing and what my senses told me. After half a second, the tunnel and the noise returned. This sequence repeated several times.
As I walked through the virtual tunnel, Solyga tracked both my neural and behavioural reactions to these surprise events. The EEG cap captured what she calls the brain’s “mismatch response,” while the virtual reality headset’s built-in accelerometers monitored changes in my walking speed when the environment suddenly stopped.
In control trials, I first walked through the same tunnel without any sound, and then again in the dark with only the noise from the speaker. These trials help the researchers isolate how much of the brain’s activity is driven by visual information, by sound, or by the combination of both.
In the coming months, Solyga plans to collect data from dozens of other participants to investigate how interactions between visual and auditory mismatch responses vary across individuals. These early human trials are an important step in understanding how the brain integrates information from different senses to build a coherent picture of the world, she says.
One of the long-term goals of this research is to develop reliable brain-based biomarkers for psychiatric conditions. If people with psychosis show abnormal or absent mismatch responses, their brain signals could be used to aid diagnosis or to track the effects of treatments, offering a more objective measure than self-reported symptoms.