Some Blind People Can Echolocate, And New Research Could Help More Learn

Bats, dolphins, and many other animals use echolocation, emitting clicks or chirps that bounce off the objects around them to help them navigate the world. It turns out humans can do the same thing, and it's an awfully handy skill for those who are blind. Even so, few blind people can echolocate today, but if researchers can better understand the technique, more might be able to learn it.


An Image Of Sound

If you've ever heard of a blind person who can echolocate, chances are it was Daniel Kish. He's given a TED talk, appeared as a guest on Invisibilia, and been featured in multiple media outlets, all in an effort to help more blind people learn to echolocate like he can.

Kish has been blind almost from birth, and since he was a young child he has clicked his tongue and used the echoes to navigate his environment. But even though he navigates by sound, he experiences the world in something much like images. "It's not computational," he told New Scientist. "There's a real palpable experience of the image as a spatial representation – here are walls, here are the corners, here is the presence of objects."

Scientists have been trying to understand how human echolocation works since at least 2011, when Kish and Brian Bushway, one of Kish's first students and now an instructor for his organization World Access For The Blind, consulted with researchers in Ontario, Canada on a study published in PLOS One. Led by Dr. Mel Goodale, the researchers gave two blind echolocators hearing tests and brain scans to see what exactly was going on when they used the skill. The subjects' hearing was no better than the average sighted person's, but the brain scans showed something remarkable: the visual areas of their brains responded to the clicks much as they might respond to images, just as Kish described. The same wasn't true of sighted people. In 2015, Goodale announced that expert echolocators could even sense the texture, weight, density, and composition of an object.

Diving Deeper

In 2017, Durham University's Lore Thaler and her team worked with Kish and others to dive even deeper into human echolocation. For a first-of-its-kind study published in PLOS Computational Biology, the researchers performed acoustic analysis of echolocators' mouth clicks to see just how they worked. While typical speech spreads over 120 to 180 degrees and falls mostly in the 300–3,000 Hz range, they found that the mouth clicks were higher-pitched and far more directional, concentrated in a narrow 60-degree cone and spanning 2,000–4,000 Hz. That focus might make it easier for echolocators to "point" their clicks at the object they're sensing.

All that research made it possible for the team to generate synthetic clicks that emulate human echolocation. From there, they could vary the clicks' frequency and repetition rate to work out how best to click in a given situation. That could make it easier to teach more blind people to echolocate, and to achieve more freedom as a result.
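To get a feel for what a synthetic click might look like, here's a minimal Python sketch that models one as a short, exponentially decaying sine burst centered in the 2,000–4,000 Hz band the study measured, repeated at an adjustable rate. The waveform shape, function names, and all parameter values here are illustrative assumptions for demonstration, not the researchers' actual fitted click model.

```python
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100  # samples per second

def synthetic_click(center_hz=3000.0, duration_s=0.003, decay=1500.0):
    """One mouth click, approximated (assumption) as a brief decaying sine burst.

    center_hz sits inside the 2,000-4,000 Hz band reported in the study;
    the ~3 ms duration and decay constant are illustrative guesses.
    """
    t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
    return np.sin(2 * np.pi * center_hz * t) * np.exp(-decay * t)

def click_train(clicks_per_s=3.0, total_s=2.0, **click_kwargs):
    """Repeat the click at a fixed rate, emulating an echolocator's cadence."""
    signal = np.zeros(int(total_s * SAMPLE_RATE))
    click = synthetic_click(**click_kwargs)
    period = int(SAMPLE_RATE / clicks_per_s)
    for start in range(0, len(signal) - len(click), period):
        signal[start:start + len(click)] += click
    return signal

# Vary center frequency and repetition rate, then write the result to a
# WAV file so the synthetic clicks can be played back and compared.
audio = click_train(clicks_per_s=3.0, center_hz=3000.0)
wavfile.write("synthetic_clicks.wav", SAMPLE_RATE, (audio * 32767).astype(np.int16))
```

Changing center_hz and clicks_per_s in this sketch mirrors, in spirit, the kind of parameter sweeps that would let researchers test which click styles work best in different situations.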


Daniel Kish's Echolocation In Action
