In humans, vision is the major channel for receiving information about the world. The same applies to most animals, even nocturnal ones that must rely more heavily on input from other senses. There are, however, exceptions: some mammals use echolocation to construct a picture of their surroundings, effectively substituting it for vision. How effective is this method of gathering information, and can it truly substitute for vision?
Echolocation is used by several kinds of animals for navigation in various environments. Whales, dolphins, and bats emit high-frequency calls and then listen to the echoes returned from the objects surrounding them. The distance to an object can be estimated from the time delay between the production of the call (click) and the detection of the echo. Since sound travels through air at about 340 m/s, a delay of 2 milliseconds means the sound covered roughly 68 cm there and back, placing the target about 34 cm away. Sound travels much faster in water (roughly 1,500 m/s) than in air, and the clicking signals produced by whales are of shorter duration than the signals produced by bats. Some echolocation signals produced by dolphins and sperm whales are even audible to humans.
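The delay-to-distance arithmetic above can be sketched in a few lines. This is a minimal illustration, not anything from the studies discussed here; the function name and the round speed-of-sound values (340 m/s in air, 1,500 m/s in water) are my own choices.

```python
def echo_distance(delay_s, speed_m_s=340.0):
    """One-way distance to a target from a round-trip echo delay.

    The sound travels to the target and back, so the one-way
    distance is half of speed * delay.
    """
    return speed_m_s * delay_s / 2.0

# A 2 ms echo delay in air (~340 m/s) puts the target about 0.34 m away:
print(echo_distance(0.002))

# The same delay underwater (~1500 m/s) corresponds to a target
# about 1.5 m away, since sound travels faster in water:
print(echo_distance(0.002, speed_m_s=1500.0))
```

Note the division by two: forgetting the round trip would double the estimated distance.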
Echolocation and its importance in the animal kingdom have been widely studied. Nature provides remarkable examples of how efficient echolocation can be. Bats easily detect tiny insects several meters away in complete darkness. Some species of bats in China and South America can fish in the dark using echolocation, detecting ripples on the water surface that indicate the presence of fish beneath it. Sperm whales use echolocation to find and catch prey, mostly giant squid, deep in the ocean. These whales can dive to depths of well over 2 kilometers and navigate through underwater canyons where their prey lives.
Although echolocation can be very efficient, it is not particularly common in the animal kingdom and was, in fact, developed independently by several groups of evolutionarily unrelated species. This is markedly different from vision, which is present in the majority of animals, with mechanisms perfected over millions of years of uninterrupted evolution.
Echolocation and Vision in Bats
If echolocation delivers essentially the same information as vision, does it rely on the same brain processes that handle visual information?
To answer this question, scientists have investigated the brain mechanisms underlying the processing of echo signals that allow animals to map objects in terms of distance and direction.
Recently, an interesting study was conducted in bats with the aim of revealing what happens in their brains while they fly through a room filled with obstacles (i.e., acoustically reflective plastic cylinders hanging from the ceiling). To determine the mechanisms underlying the bats' navigation around these obstacles, the researchers recorded neural activity while the bats were in flight.
The results show that the bats adjusted their flight and sonar behavior in response to the echoes coming from the objects in the room. The objects' positions changed across recording sessions, and the bats started each flight from a different point, ensuring that they relied on echo feedback alone rather than on spatial memory from previous sessions.
The most important finding was the identification of the brain region that helped the animals locate the objects in their environment. Echolocation signals were processed in the superior colliculus, a structure located in the midbrain. The superior colliculus consists of several layers that respond to different kinds of stimuli: its superficial layers process visual input, while its deeper layers integrate information from multiple senses to guide orienting movements. Thus, it seems that echolocation may indeed help animals obtain a picture of their environment that is as authentic as the one received through visual channels.
Echolocation and Vision in Humans
According to scientists, echolocation is not a phenomenon completely alien to humans. It seems that some blind individuals can be trained in echolocation.
Using this technique, they can locate objects by generating mouth clicks and listening to the returning echoes. These echoes can provide important information such as the position, distance, and even the size or shape of objects.
Several studies have been conducted to determine the neural mechanisms underlying human echolocation. One study investigated two individuals skilled in echolocation, one early blind and one late blind. The authors measured brain activity in both participants while they listened to their own echo sounds, comparing activity during clicks that produced echoes with activity during control sounds that did not.
It turned out that the processing of echo sounds activates brain regions typically associated with vision rather than hearing. More specifically, echo signals were processed in the visual cortex, rather than in the superior colliculus as in bats and other echolocating animals. This difference is unsurprising: in humans, visual processing is centered on the greatly expanded visual cortex rather than on the superior colliculus.
Thus, in both animals and humans, the information received through echolocation is processed in those regions that are also predominantly responsible for the processing of visual information. The curious examples of human echolocation are a perfect illustration of the plasticity of our brain and its ability to adapt to changing circumstances (blindness in this case).
A recent publication on echolocation in humans reviewed the applications of this phenomenon as well as the processes occurring in the brains of echolocation experts (i.e., individuals who are skilled in echolocation). The authors reported that echolocation may enable blind people to sense small variations in the location, size, and shape of objects, or even to distinguish the materials objects are made of, simply by listening to the echoes of their own mouth clicks.
It seems that echolocation can be perfected by blind individuals to facilitate daily tasks and achieve a higher degree of independence. Based on neuroimaging studies, the review confirmed that the processing of input signals from echoes activates the visual cortex, the brain region that would normally support vision in the sighted brain.
Watwood, S.L., Miller, P.J., Johnson, M., Madsen, P.T., Tyack, P.L. (2006). Deep-diving foraging behaviour of sperm whales (Physeter macrocephalus). Journal of Animal Ecology. 75(3): 814-825. doi: 10.1111/j.1365-2656.2006.01101.x
Kothari, N.B., Wohlgemuth, M.J., Moss, C.F. (2018). Dynamic representation of 3D auditory space in the midbrain of the free-flying echolocating bat. eLife. 7: e29053. doi: 10.7554/eLife.29053
Thaler, L., Arnott, S.R., Goodale, M.A. (2011). Neural correlates of natural human echolocation in early and late blind echolocation experts. PLoS One. 6(5): e20162. doi: 10.1371/journal.pone.0020162
Thaler, L., Goodale, M.A. (2016). Echolocation in humans: an overview. Wiley Interdisciplinary Reviews: Cognitive Science. 7(6): 382-393. doi: 10.1002/wcs.1408
Source: Brain Blogger