Your eyes and ears talk to each other, study finds

Scientists have discovered a fascinating connection between the eyes and the ears: they can now tell where a person's eyes are focused simply by listening to that person's ears.

Both the movement of the eyes and the target the eyes are looking at can be identified from recordings made with a microphone placed in the ear canal.

In 2018, scientists discovered that the ears make a subtle, imperceptible noise when the eyes move. In a new study, scientists from Duke University show that these sounds can reveal where your eyes are looking. The scientists also found that they could predict the ear sound’s waveform by knowing where someone was looking.

This two-way relationship between eye movements and ear sounds could have implications for understanding perception and developing new clinical tests for hearing.

Participants tracked a green dot on the screen while researchers listened to the sounds made in their ear canals using microphone-embedded earbuds. Credit: Meredith Schmehl/Duke University

According to the scientists, these sounds may be produced when eye movements prompt the brain to contract either the middle ear muscles, which typically help dampen loud sounds, or the hair cells, which help amplify quiet sounds. The exact purpose of these ear squeaks is unclear, but they might help sharpen people’s perception.

Jennifer Groh, Ph.D., senior author of the new report, said, “We think this is part of a system for allowing the brain to match up where sights and sounds are located, even though our eyes can move when our head and ears do not. Understanding the relationship between subtle ear sounds and vision might lead to the development of new clinical tests for hearing.”

Stephanie Lovich, one of the lead authors of the paper and a graduate student in psychology & neuroscience at Duke, said, “If each part of the ear contributes individual rules for the eardrum signal, then they could be used as a type of clinical tool to assess which part of the anatomy in the ear is malfunctioning.”

Following their initial finding, the research team’s most recent investigation examined whether these weak audio signals carried precise information about eye movements.

To decode people’s ear noises, the scientists recruited 16 adults with normal vision and hearing to come to Groh’s lab in Durham for a relatively straightforward eye test.

After focusing their attention on a stationary green dot on a computer screen, participants tracked it with their eyes as it vanished and reappeared up, down, left, right, or diagonally from its initial position. This allowed Groh’s team to analyze a broad range of auditory signals produced by horizontal, vertical, and diagonal eye movements.

An eye tracker recorded each participant’s pupil movements, while the ear sounds were recorded with a pair of microphone-embedded earbuds, so the two signals could be compared against each other.
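To give a concrete sense of what pairing the two recordings involves, here is a minimal, hypothetical Python sketch (not the authors’ code): it detects saccade onsets from eye-tracker samples and cuts out the matching snippet of ear-canal audio around each one. The sampling rates, velocity threshold, and window lengths are assumptions for illustration only.

```python
# Illustrative sketch (not the study's analysis code): epoch ear-canal audio
# around saccade onsets so each eye movement can be paired with its ear waveform.
# All names, sampling rates, and thresholds here are hypothetical.
import numpy as np

AUDIO_FS = 48_000   # assumed microphone sampling rate (Hz)
EYE_FS = 1_000      # assumed eye-tracker sampling rate (Hz)

def detect_saccade_onsets(gaze_deg, fs=EYE_FS, vel_thresh=30.0):
    """Return sample indices where gaze velocity first exceeds a threshold (deg/s)."""
    velocity = np.abs(np.gradient(gaze_deg) * fs)       # deg per second
    above = velocity > vel_thresh
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1  # rising edges only

def epoch_audio(audio, onsets_eye, pre_ms=10, post_ms=60):
    """Cut audio snippets around each saccade onset (given in eye-tracker samples)."""
    pre = int(AUDIO_FS * pre_ms / 1000)
    post = int(AUDIO_FS * post_ms / 1000)
    epochs = []
    for onset in onsets_eye:
        center = int(onset * AUDIO_FS / EYE_FS)          # convert to audio samples
        if center - pre >= 0 and center + post <= len(audio):
            epochs.append(audio[center - pre:center + post])
    return np.array(epochs)

# Tiny synthetic demo: a 2-second recording with one ~20-degree rightward saccade.
t_eye = np.arange(2 * EYE_FS) / EYE_FS
gaze = 20.0 / (1 + np.exp(-(t_eye - 1.0) * 100))         # sigmoid-like eye movement
audio = np.random.randn(2 * AUDIO_FS) * 1e-3             # stand-in for microphone noise

onsets = detect_saccade_onsets(gaze)
snippets = epoch_audio(audio, onsets)
print(f"{len(onsets)} saccade(s) detected; epoch array shape: {snippets.shape}")
```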

After examining the ear noises, the research team discovered distinct signatures for various movement directions. Because of this, they could decipher the ear sound’s coding and determine where people were looking just by examining a soundwave.

Lovich said, “Since a diagonal eye movement is just a horizontal component and vertical component, my labmate and co-author David Murphy realized you could take those two components and guess what they would be if you put them together. Then you can go in the opposite direction and look at an oscillation to predict that someone was looking 30 degrees to the left.”
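To make that component idea concrete, here is a minimal sketch assuming hypothetical per-degree “template” waveforms for purely horizontal and purely vertical eye movements (the templates and numbers are invented for illustration, not taken from the paper): a diagonal movement’s waveform is modeled as a weighted sum of the two templates, and simple least squares inverts that model to estimate how far left/right and up/down the eyes moved from an observed oscillation.

```python
# Illustrative sketch (assumed templates, not the published analysis):
# forward model = weighted sum of horizontal and vertical waveform templates,
# inverse model = least-squares recovery of the gaze shift from a recorded sound.
import numpy as np

fs = 48_000
t = np.arange(int(0.06 * fs)) / fs                      # 60 ms analysis window

# Hypothetical per-degree "template" oscillations for horizontal and vertical moves.
template_h = np.sin(2 * np.pi * 30 * t) * np.exp(-t / 0.02)
template_v = np.cos(2 * np.pi * 30 * t) * np.exp(-t / 0.02)

def predict_waveform(dx_deg, dy_deg):
    """Forward model: waveform expected for a (dx, dy) degree eye movement."""
    return dx_deg * template_h + dy_deg * template_v

def estimate_gaze_shift(waveform):
    """Inverse model: least-squares estimate of (dx, dy) from an observed waveform."""
    basis = np.column_stack([template_h, template_v])
    coeffs, *_ = np.linalg.lstsq(basis, waveform, rcond=None)
    return coeffs                                        # (dx_deg, dy_deg)

# Simulate a noisy recording of a 30-degree leftward, slightly upward saccade,
# then recover the movement from the sound alone.
observed = predict_waveform(-30.0, 5.0) + np.random.randn(len(t)) * 0.5
dx_hat, dy_hat = estimate_gaze_shift(observed)
print(f"estimated shift: {dx_hat:.1f} deg horizontal, {dy_hat:.1f} deg vertical")
```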

The scientists now plan to examine whether these ear sounds play a role in perception. They also plan to test whether people who don’t have hearing or vision loss generate ear signals that can predict how well they perform on a sound-localization task, such as spotting where an ambulance is while driving, which relies on mapping auditory information onto a visual scene.

Journal Reference:

  1. Stephanie N. Lovich, Cynthia King, David Murphy, Jennifer Groh. Parametric information about eye movements is sent to the ears. Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.2303562120

