
I can hear where you’re looking

Written by Beatrice Bowlby (Digital Editor)

Researchers have discovered that subtle squeaking sounds generated in the ear by eye movements can indicate where the eyes are looking.

In 2018, a team at Duke University (NC, USA) led by Jennifer Groh discovered that the ears make an imperceptible sound when the eyes move. Five years later, the team has found that these ear noises can indicate where your eyes are looking. The relationship also works in reverse: knowing where the eyes are looking, the researchers could predict the waveform composition of the sound generated in the ear.

Groh believes that these subtle squeaking sounds may arise when eye movements prompt the brain to contract either the middle ear muscles, which dampen loud sounds, or the hair cells, which amplify quiet sounds. Although the purpose of these ear sounds isn’t entirely clear, Groh hypothesizes that they play a role in perception: “We think this is part of a system for allowing the brain to match up where sights and sounds are located, even though our eyes can move when our head and ears do not.”


Having discovered this link between the visual and auditory systems, the team wanted to investigate whether these ear sounds carry detailed information about the eyes’ movements. To do this, they recruited 16 adults with unimpaired sight and hearing to complete an eye-tracking task: following a green dot – with only their eyes – as it disappeared and reappeared at different locations on a screen, requiring participants to move their eyes horizontally, vertically and diagonally. While they completed this task, the participants’ eye movements were tracked and their ear sounds were recorded using earbuds embedded with microphones.

The team aligned these eye and ear recordings to determine which sounds corresponded to which eye movements, finding that each eye movement produced a unique sound signature in the ear. This allowed the researchers to decode the ear sounds and determine where people were looking from the sound waves alone.

“Since a diagonal eye movement is just a horizontal component and vertical component, my lab mate and co-author David Murphy realized you can take those two components and guess what they would be if you put them together,” first author Stephanie Lovich commented. “Then you can go in the opposite direction and look at an oscillation to predict that someone was looking 30 degrees to the left.”
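The article doesn’t detail the team’s actual analysis, but the decomposition Lovich describes – treating the ear sound of a diagonal eye movement as the sum of a horizontal and a vertical component – can be sketched in a few lines of Python. Everything below is hypothetical and purely illustrative: the template waveforms, sample rate and the functions predict_waveform and estimate_gaze are assumptions for this sketch, not the study’s method.

```python
import numpy as np

# Hypothetical setup: assume the ear sound evoked by an eye movement is a
# linear combination of a "horizontal" and a "vertical" template waveform,
# each scaled by the movement's amplitude along that axis.
fs = 48_000                      # assumed sample rate (Hz)
t = np.arange(0, 0.02, 1 / fs)   # 20 ms window around the eye movement

# Made-up template oscillations for a 1-degree horizontal / vertical movement
h_template = np.sin(2 * np.pi * 120 * t) * np.exp(-t / 0.005)
v_template = np.sin(2 * np.pi * 180 * t) * np.exp(-t / 0.005)

def predict_waveform(h_deg, v_deg):
    """Forward direction: given a gaze displacement, predict the ear sound."""
    return h_deg * h_template + v_deg * v_template

def estimate_gaze(waveform):
    """Reverse direction: project a recorded waveform onto the two templates
    (least squares) to recover the horizontal/vertical components."""
    basis = np.column_stack([h_template, v_template])
    coeffs, *_ = np.linalg.lstsq(basis, waveform, rcond=None)
    return coeffs  # (h_deg, v_deg)

# A diagonal eye movement 30 degrees left and 10 degrees up...
diagonal = predict_waveform(-30, 10)
# ...is decoded back to its components from the sound alone.
print(estimate_gaze(diagonal))   # ~[-30.  10.]
```

In this toy model, both directions of inference described in the article fall out of the same linear assumption: combining components predicts the sound for any gaze shift, and projecting a measured sound back onto the components recovers the gaze shift.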

The next step for Groh’s team is to determine whether this link is important for perception and how it may be affected in individuals with hearing or vision loss. Groh is also investigating whether the ear signals of individuals without hearing or vision loss can predict their performance on sound localization tasks, which require mapping auditory information onto a visual scene.