
Is it a bird?… Is it a plane?…

Written by Jenny Straiton (Assistant Editor)

How past experiences interact with sensory input to infer speed and allow the eyes to track moving objects.


As a plane flies across the sky, the muscles that allow the eye to track its movement set their pace not only by the speed they observe but also by the speed they would expect from having watched planes before.

New research from Duke University (NC, USA), published in Nature Neuroscience, has uncovered the circuit underlying this behavior and how the brain makes statistical inferences about motion, showing how visual sensory information combines with prior experience to predict and adjust motor control during eye-tracking movements.

Many papers over the last few years have shown that the brain can use inference and past experiences to predict the outcome of sensory inputs. However, “to our knowledge this is the first time anyone has found a place perfectly situated to cause the behavioral outputs we see,” commented Stephen Lisberger, senior author of the paper, referring to the smooth-pursuit region of the frontal eye fields (FEFSEM), which is causally involved in smooth-pursuit eye movements.

By recording the activity of single neurons in the FEFSEM of monkeys, the researchers could track synaptic changes and the associated movements. The memory of past experience is established by the strength of the synapses in the neural circuit that drives what are known as pursuit eye movements, and synaptic plasticity allows these connections to be modified as experiences change.

When the researchers replicated the circuit in a computerized neural network, they showed that its activity and the resulting educated guesses mimic Bayesian statistical inference, in which the estimated probability of a given outcome is updated as more information is added.
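The Bayesian idea at play can be illustrated with a toy calculation (a minimal sketch, not the authors’ actual network model): if past experience suggests that targets usually move at a certain speed, and the current visual measurement of speed is noisy, an optimal observer combines the two by weighting each according to its reliability. The speeds, noise levels and function names below are illustrative assumptions.

```python
# Toy illustration of Bayesian speed estimation (not the authors' model).
# Prior: speeds learned from past experience; likelihood: the current,
# noisy visual measurement. Both are modeled as Gaussians, so the
# posterior mean is a reliability-weighted average of the two.

def posterior_speed(prior_mean, prior_sd, measured_speed, measurement_sd):
    """Combine a Gaussian prior with a Gaussian measurement of speed."""
    w_prior = 1.0 / prior_sd**2        # precision (reliability) of the prior
    w_meas = 1.0 / measurement_sd**2   # precision of the sensory measurement
    mean = (w_prior * prior_mean + w_meas * measured_speed) / (w_prior + w_meas)
    sd = (w_prior + w_meas) ** -0.5
    return mean, sd

# Hypothetical numbers: experience says targets move at ~10 deg/s,
# but the current target appears to move at 20 deg/s.
prior_mean, prior_sd = 10.0, 2.0

# Bright, distinct target: the measurement is reliable, so the
# estimate follows the stimulus closely (posterior mean ~18 deg/s).
print(posterior_speed(prior_mean, prior_sd, 20.0, measurement_sd=1.0))

# Dim, low-contrast target: the measurement is unreliable, so the
# estimate is pulled back toward the prior (posterior mean ~11 deg/s).
print(posterior_speed(prior_mean, prior_sd, 20.0, measurement_sd=6.0))
```

In this sketch, the weaker the sensory evidence, the more the estimate leans on prior experience, which is the same reliability weighting described in the driving analogy below.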

Their experiments showed that this statistical estimation becomes more important when the visual target is less distinct: “It’s like when you’re driving a familiar road at night or in the fog – you go by what you know is going to happen next as much as by what you can actually see,” Lisberger said.

“In low light, FEFSEM says, ‘don’t track that,’” Lisberger continued. “But if it’s a bright patch, it says ‘tune in, get it.’” This insight should enable the team to predict eye movements by looking at the neurons alone, he concluded.