Bats are everywhere – almost – and have been found to be particularly sensitive to the effects of climate change, so researchers have enlisted the aid of citizen scientists in what is called the Bat Detective project. The aim is to study bat populations and learn about the general health of the environment in which they reside. Healthy bat populations suggest healthy biodiversity.
Chiroptera – bats – comprise the second-largest order of mammals, with about 1,200 species, or 20% of all classified mammal species (behind Rodentia, the rodents). The Arctic, the Antarctic and a few isolated oceanic islands are the only places on Earth where bats are not found.
Most bat species use biological sonar, or echolocation, to navigate and to hunt for food. Detecting these sounds in audio recordings can help scientists monitor changes in bat populations, but it can be difficult to reliably detect bat calls in noisy, real-world conditions.
Using data collected by ordinary people, researchers led by Oisin Mac Aodha, from the California Institute of Technology, better known as Caltech, in Pasadena, US, have developed new, open-source programs to automatically detect bat echolocation calls in audio recordings. They presented their approach in the journal PLOS Computational Biology.
The team included participants from Australia, Romania and Bulgaria, and from Britain’s University College London, where Mac Aodha was originally based. They initially found that to effectively assess bat population trends they needed accurate, reliable, open-source tools for detecting and classifying bat calls in large collections of audio recordings. They noted, however, that existing applications were commercial or focused on species classification, neglecting the key problem of first localising echolocation calls in audio.
They set about building new bat-call detection systems based on recent developments in machine learning. Their approach relies on supervised learning with deep convolutional neural networks (CNNs), allowing computers to learn directly from audio data to automatically and reliably detect bat calls and filter out background noise.
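In practice, a detector of this kind typically converts each recording into a spectrogram and scores short windows of it with a trained classifier. The Python sketch below illustrates that idea only; the file name, window length, threshold and classifier interface are assumptions made for the example, not details taken from the team's software.

    # Minimal sketch of the detection idea: turn audio into a spectrogram,
    # slide a window across it, and let a trained classifier score each
    # window for the presence of a bat call. The window size and decision
    # threshold here are illustrative assumptions, not the team's values.
    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import spectrogram

    def detect_calls(wav_path, classifier, win_frames=32, threshold=0.5):
        """Return the start times (in seconds) of windows scored as bat calls."""
        rate, audio = wavfile.read(wav_path)      # time-expanded ultrasonic recording
        freqs, times, spec = spectrogram(audio, fs=rate, nperseg=256, noverlap=192)
        spec = np.log1p(spec)                     # compress the dynamic range

        detections = []
        for start in range(0, spec.shape[1] - win_frames, win_frames // 2):
            window = spec[:, start:start + win_frames]  # one spectrogram patch
            score = classifier(window)                  # probability the patch holds a call
            if score >= threshold:
                detections.append(times[start])
        return detections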
According to the online tutorial, A beginner’s guide to understanding convolutional neural networks, which describes CNNs in terms of image recognition, these systems are inspired by how our brains function.
When a computer takes an image or a sound as input, all it perceives is an array of values. We want the computer to differentiate between the sounds or images it is given and figure out their unique features. It's a process that goes on in our own minds subconsciously.
In a similar way, the computer is able to perform classifications by looking for distinctive features and then building up to more abstract concepts through a series of mathematical operations.
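A network of this kind can be written in a few lines. The sketch below, in Python using PyTorch, shows stacked convolutional layers building from local spectrogram patterns towards a single call/no-call score; the layer sizes and the 64-by-32 input patch are assumptions chosen for illustration, not the architecture reported in the paper.

    # Illustrative small CNN of the kind described above, written in PyTorch.
    # The layer sizes and the 64x32 spectrogram-patch input are assumptions
    # for the example, not the model from the study.
    import torch
    from torch import nn

    model = nn.Sequential(
        # Early convolutions pick up local patterns in the spectrogram patch,
        # such as the short frequency sweeps characteristic of echolocation calls.
        nn.Conv2d(1, 16, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.MaxPool2d(2),
        # Deeper layers combine those local patterns into more abstract features.
        nn.Conv2d(16, 32, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.MaxPool2d(2),
        # A final fully connected layer maps the features to a single score:
        # the probability that the patch contains a bat call.
        nn.Flatten(),
        nn.Linear(32 * 16 * 8, 1),
        nn.Sigmoid(),
    )

    # One 64x32 spectrogram patch (batch, channel, frequency bins, time frames).
    patch = torch.randn(1, 1, 64, 32)
    probability = model(patch)   # value between 0 and 1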
CNNs hadn’t previously been applied to bat monitoring, mainly because not enough human-analysed data was available to train them. To amass enough information, Mac Aodha and colleagues turned to thousands of citizen scientists who collected and annotated audio recordings as participants in the Indicator Bats Program (iBats) and Bat Detective project.
The researchers tapped into the online citizen science platform Zooniverse to enable members of the public to participate.
Between October 2012 and September 2016, almost 3,000 users contributed and listened to 127,451 unique clips and made 605,907 annotations; 14,339 of these clips were labelled as containing a bat call, with 10,272 identified as containing search-phase echolocation calls.
“Our method enables us to automatically estimate bat activity from multi-year, large-scale audio monitoring programs with very little manual effort, enabling us to scale up monitoring efforts globally,” Mac Aodha says.
The algorithm is currently deployed as part of a real-time bat monitoring project at the Queen Elizabeth Olympic Park in east London. The team is now working to develop systems that can extract more detailed information from audio recordings, such as the presence of specific bat species.