Researchers are using new technology to study the natural world of sound in greater detail, and on a larger scale, than ever before.
With the help of ubiquitous microphones and machine learning models, scientists are able to listen to sounds that are otherwise inaudible to the human ear, allowing them to eavesdrop on an astonishing array of “conversations” among bats, whales, honeybees, elephants, plants, and coral reefs.
This technology is being used to explore vast new kinds of data, such as satellite imagery, genome sequences, quantum sensing, and bioacoustic recordings, extending the frontiers of human knowledge.
One area of particular interest is the use of machine learning to translate and replicate animal sounds, creating a kind of Google Translate for the zoo. Some researchers believe this could lead to a form of interspecies communication within the next two decades.
This sonic revolution has been triggered by advances in both hardware and software. Cheap, durable microphones and sensors can be attached to trees in the Amazon, to rocks in the Arctic, or to the backs of dolphins, enabling real-time monitoring.
That stream of bioacoustic data is then processed by machine learning algorithms, which can detect patterns in infrasonic (below roughly 20 Hz) or ultrasonic (above roughly 20 kHz) natural sounds, inaudible to the human ear.
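The first step of such a pipeline, deciding whether a recording contains energy outside the human-audible band, can be sketched in a few lines. This is a minimal illustration, not the method of any particular project: the helper `band_energy` and the synthetic 30 kHz "bat-like" tone are assumptions standing in for a real recording, and in practice a trained classifier would operate on spectrograms rather than a single energy ratio.

```python
import numpy as np

def band_energy(signal, sample_rate, f_lo, f_hi):
    """Fraction of the signal's spectral energy between f_lo and f_hi (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    band = (freqs >= f_lo) & (freqs < f_hi)
    return spectrum[band].sum() / spectrum.sum()

# Synthetic stand-in for a field recording: a 30 kHz tone sampled at
# 96 kHz, well above the ~20 kHz ceiling of human hearing.
sr = 96_000
t = np.arange(sr) / sr                    # one second of audio
call = np.sin(2 * np.pi * 30_000 * t)

ultrasonic = band_energy(call, sr, 20_000, 48_000)
audible = band_energy(call, sr, 20, 20_000)
print(f"ultrasonic: {ultrasonic:.2f}, audible: {audible:.2f}")
```

Recordings whose energy falls mostly in the ultrasonic or infrasonic band would then be routed to the relevant species classifier.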
These data only make sense, however, when combined with observations of natural behavior gained through painstaking fieldwork by biologists or crowdsourced analysis by amateurs.
Citizen science initiatives like Zooniverse, which can mobilize more than 1 million volunteers, supply the wide range of labeled data and training sets that these machine learning models require.
Overall, this new technology has the potential to open up new frontiers of understanding and communication with the natural world, but realizing that potential will require combining machine learning, human observation, and citizen science.