Lag – the brain kind – visual versus auditory

I have started watching the new PBS six-part series “The Brain with David Eagleman.”  I was telling housefly a little about it.  What really caught my attention was when they ran a split-screen comparison of sprinters starting a race.  On one screen, the runners start the race when they hear the bang of a pistol.  On the other screen, the runners start the race when they see a light flash on.  The two races play back in slow motion, with the bang of the gun synced to the exact same moment as the flash of light.  In slow motion, it’s clear that the runners process the sound well before the visual: they get off the blocks noticeably earlier from the bang of the gun.

David Eagleman discusses how many different systems have to act in concert to process visual information, and how all that processing delays our reaction to visual stimuli.  Sound is processed more directly by the brain; while there is still a small amount of lag from the natural wiring, we can react to aural information much faster.  I thought it was a really neat way to illustrate the importance of paying attention to sound in Quake 2.  When I’m stoned and listening to music while playing Quake 2 CTF, I’m not going to shoot until I see you.  When my opponent is h0ps, he hears me coming around the corner and has clicked the fire button before I even see him.
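To put rough numbers on that gap (these are commonly cited ballpark figures for simple reaction tasks, not measurements from the show), here is a quick back-of-the-envelope sketch:

```python
# Rough numbers on the "bang vs. flash" reaction gap. The reaction-time
# values below are commonly cited approximations for simple reaction tasks;
# they are assumptions for illustration, not data from the series.

AUDITORY_RT = 0.150   # s, typical simple reaction time to a sound
VISUAL_RT   = 0.190   # s, typical simple reaction time to a light
RACE_SPEED  = 10.0    # m/s, ballpark sprinting speed once up to pace (assumed)

gap = VISUAL_RT - AUDITORY_RT   # head start, in seconds, for the runner cued by sound
lead = gap * RACE_SPEED         # roughly how far ahead that puts them at race pace

print(f"Reaction-time gap: {gap * 1000:.0f} ms")
print(f"Equivalent lead at {RACE_SPEED:.0f} m/s: about {lead:.2f} m")
```

Under those assumptions, the roughly 40 ms advantage for the bang works out to something like 0.4 m at full sprinting speed, which is why the difference is so obvious in the slow-motion footage.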

Here is a relevant excerpt from David Eagleman, writing for Edge.org in 2009:

“It may be that a unified polysensory perception of the world has to wait for the slowest overall information. Given conduction times along limbs, this leads to the bizarre but testable suggestion that tall people may live further in the past than short people. The consequence of waiting for temporally spread signals is that perception becomes something like the airing of a live television show. Such shows are not truly live but are delayed by a small window of time, in case editing becomes necessary.

Waiting to collect all the information solves part of the temporal-binding problem, but not all of it. A second problem is this: if the brain collects information from different senses in different areas and at different speeds, how does it determine how the signals are supposed to line up with one another? To illustrate the problem, snap your fingers in front of your face. The sight of your fingers and the sound of the snap appear simultaneous. But it turns out that impression is laboriously constructed by your brain. After all, your hearing and your vision process information at different speeds. A gun is used to start sprinters, instead of a flash, because you can react faster to a bang than to a flash. This behavioral fact has been known since the 1880s and in recent decades has been corroborated by physiology: the cells in your auditory cortex can change their firing rate more quickly in response to a bang than your visual cortex cells can in response to a flash.

The story seems as though it should be wrapped up here. Yet when we go outside the realm of motor reactions and into the realm of perception (what you report you saw and heard), the plot thickens. When it comes to awareness, your brain goes through a good deal of trouble to perceptually synchronize incoming signals that were synchronized in the outside world. So a firing gun will seem to you to have banged and flashed at the same time. (At least when the gun is within thirty meters; past that, the different speeds of light and sound cause the signals to arrive too far apart to be synchronized.)

But given that the brain received the signals at different times, how can it know what was supposed to be simultaneous in the outside world? How does it know that a bang didn’t really happen before a flash? It has been shown that the brain constantly recalibrates its expectations about arrival times. And it does so by starting with a single, simple assumption: if it sends out a motor act (such as a clap of the hands), all the feedback should be assumed to be simultaneous and any delays should be adjusted until simultaneity is perceived. In other words, the best way to predict the expected relative timing of incoming signals is to interact with the world: each time you kick or touch or knock on something, your brain makes the assumption that the sound, sight, and touch are simultaneous.

While this is a normally adaptive mechanism, we have discovered a strange consequence of it: Imagine that every time you press a key, you cause a brief flash of light. Now imagine we sneakily inject a tiny delay (say, two hundred milliseconds) between your key-press and the subsequent flash. You may not even be aware of the small, extra delay. However, if we suddenly remove the delay, you will now believe that the flash occurred before your key-press, an illusory reversal of action and sensation. Your brain tells you this, of course, because it has adjusted to the timing of the delay.”
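Two of the numbers in that excerpt are easy to sanity-check. The sketch below is a toy model of my own, not anything from Eagleman: it assumes sound travels at roughly 343 m/s, that light is effectively instantaneous over tens of meters, that the brain’s resynchronization window is on the order of 100 ms, and it treats the recalibration as nothing more than a running estimate of the expected key-to-flash delay.

```python
# Toy sketch of two points from the excerpt. All the constants here are
# my assumptions for illustration, not figures taken from Eagleman.

SPEED_OF_SOUND = 343.0   # m/s in room-temperature air
SYNC_WINDOW = 0.100      # s; assumed width of the brain's resynchronization window


def seems_simultaneous(distance_m: float) -> bool:
    """Is the sound's travel-time lag small enough to be fused with the flash?
    Light's travel time is negligible over tens of meters, so the mismatch
    is essentially just how long the sound takes to arrive."""
    return distance_m / SPEED_OF_SOUND <= SYNC_WINDOW


for d in (10, 30, 50):
    lag_ms = 1000 * d / SPEED_OF_SOUND
    verdict = "fused as simultaneous" if seems_simultaneous(d) else "perceived as separate"
    print(f"{d:>2} m: sound arrives {lag_ms:.0f} ms after the flash -> {verdict}")

# Recalibration modeled as a single running estimate of the expected
# key-press-to-flash delay. Perceived offset = actual delay - expected delay,
# so a negative value means the flash seems to come before the press.
expected_delay = 0.0
trials = [0.200] * 20 + [0.0]   # twenty trials with a sneaky 200 ms delay, then it is removed
for trial, actual_delay in enumerate(trials, start=1):
    perceived = actual_delay - expected_delay
    if trial in (1, 20, 21):
        order = "flash seems to precede the press" if perceived < 0 else "press, then flash"
        print(f"trial {trial:>2}: perceived offset {perceived * 1000:+4.0f} ms ({order})")
    # crude adjustment toward whatever delay keeps being observed
    expected_delay += 0.5 * (actual_delay - expected_delay)
```

With those assumptions, the sound from a gun 30 meters away arrives about 90 ms late, right at the edge of the window, which lines up with the thirty-meter figure in the excerpt; and once the toy model has settled on an expected 200 ms delay, removing it flips the perceived order, which is the illusory reversal Eagleman describes.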
