The Soundscapes of Restored Reefs
Researchers are using passive acoustic monitoring and artificial intelligence to measure the success of our coral reef restoration program in South Sulawesi, Indonesia.
The Sound of Recovery
Coral reefs are noisy places. On a thriving reef you’ll hear popping, grunting, whooping and other bizarre sounds produced by the diverse fish community. If you listen closely, you may also hear the background crackling of snapping shrimp, like a campfire. But what can these sounds tell us? Ben Williams, author of a recent research project investigating the soundscapes of restored reefs, explains how we can use these sounds and an artificial intelligence model to measure the success of coral reef restoration...
In 2021, we kickstarted our initial research in Spermonde, Indonesia by exploring the use of passive acoustic monitoring to measure the success of the Mars coral reef restoration program. By listening to one-minute snapshot recordings of our restored sites, we found a much higher diversity of fish sounds than on nearby degraded reefs. However, manually listening to our recordings in search of these fish sounds was a slow and labour-intensive process. To scale this approach up to more sites and restoration projects around the world, we were going to need a way to automate it.
Listen to the soundscape of a thriving reef recorded near the Mars restoration project.
In our recent publication, we therefore turned to artificial intelligence (AI) to see if it could do the listening for us. To make this happen, we first trained an AI on recordings from nearby healthy and degraded reefs so that it could learn the difference between their soundscapes. We then tested its performance on new recordings it hadn’t heard before. Through trial and error, we eventually produced an AI that could correctly identify which kind of reef these new recordings came from almost 92% of the time.
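For readers curious what this train-and-test workflow can look like in practice, here is a minimal Python sketch. It is not the study’s actual pipeline: the folder layout, the MFCC summary features and the random-forest classifier are all illustrative assumptions standing in for whichever features and model the published work used.

```python
# A minimal sketch of the train/test workflow described above, NOT the
# study's actual pipeline: file paths, features (MFCC summaries) and the
# random-forest model are illustrative assumptions.
from pathlib import Path

import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def clip_features(path):
    """Summarise a one-minute clip as the mean and std of its MFCCs."""
    audio, sr = librosa.load(path, sr=None)  # keep the native sample rate
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])


# Hypothetical layout: recordings/healthy/*.wav and recordings/degraded/*.wav
X, y = [], []
for label in ("healthy", "degraded"):
    for wav in Path("recordings", label).glob("*.wav"):
        X.append(clip_features(wav))
        y.append(label)
X, y = np.array(X), np.array(y)

# Hold back recordings the model has never "heard" to test it on
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.1%}")
```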
With this exciting new tool in hand, we applied it to recordings from restored sites. First up was a younger site, where the MARRS restoration intervention had begun nine months previously. Our AI found that most of the snapshot recordings taken on this reef still sounded degraded. However, we also had recordings from two of the more mature restored reefs, where restoration had begun over two years previously. This time our AI classified almost all the recordings taken on each of these reefs as sounding healthy.
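Continuing the sketch above, one way a site-level verdict might emerge is by classifying every snapshot clip from a restored site and looking at the share judged healthy; the directory name below is, again, hypothetical.

```python
# Classify snapshot clips from a (hypothetical) mature restored site using
# the model trained in the sketch above.
restored = [clip_features(wav)
            for wav in Path("recordings", "restored_2yr").glob("*.wav")]
votes = model.predict(np.array(restored))
print(f"{np.mean(votes == 'healthy'):.0%} of snapshots classified as healthy")
```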
The AI found that immature restored reefs, similar to the one pictured left, still sounded similar to a degraded reef. However, mature sites, like the one pictured right, had returned to a fully healthy soundscape.
Given the success we had in Spermonde, the next step is to take this approach global! We’ll be taking recordings from multiple MARRS sites around the world to determine whether we see this recovery in the soundscape universally, and whether AI can help track it. This time we’ll be using our new HydroMoth recorders, which we recently developed with help from the Mars team, who tested them out in Spermonde. We’ve also been using deep learning, the cutting edge of today’s AI, to get the very most out of these fascinating datasets.