According to a new study, AI can be used to track coral restoration by comparing the sounds of healthy and unhealthy reefs.
Sounds from a reef come from a variety of sources, including the water flowing over the reef and the animals that live there. An unhealthy reef is likely to be quieter, because fewer animals are tucked into its nooks and crevices. So a team of researchers set out to train a computer to judge the quality of a reef from its soundscape – a distinction the human ear cannot easily make.
At the Mars Coral Reef Restoration Project in Indonesia, where reef spiders have been used to repopulate reefs destroyed by dynamite fishing, the scientists deployed hydrophones that recorded at hourly intervals at four types of sites: healthy, degraded, recently restored and mature restored.
After feeding the recordings to a computer algorithm, the scientists found that the computer could distinguish the acoustic patterns of healthy reefs from those of degraded ones. Traditionally, reef health is assessed by sending divers out to visually survey a reef's degradation and the animals that live there – a method that demands far more time and expense than suspending a hydrophone over a reef.
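The study does not specify the algorithm used, but the general approach – extracting acoustic features from recordings and training a classifier to separate the two reef states – can be sketched as follows. This is a minimal illustration, not the authors' method: the recordings are synthetic stand-ins in which "healthy" clips contain extra broadband impulses (loosely mimicking snapping shrimp), and the features and classifier are generic choices.

```python
# Minimal sketch of soundscape classification (NOT the study's actual pipeline):
# extract coarse spectral features from audio clips, then train a
# random-forest classifier to separate "healthy" from "degraded" clips.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
SR = 8000  # assumed sample rate in Hz (hypothetical)

def make_clip(healthy: bool) -> np.ndarray:
    """Synthetic 1-second clip; 'healthy' clips get broadband impulses."""
    clip = rng.normal(0, 0.1, SR)
    if healthy:
        for _ in range(20):  # impulse bursts, like invertebrate snaps
            i = rng.integers(0, SR - 50)
            clip[i:i + 50] += rng.normal(0, 1.0, 50)
    return clip

def features(clip: np.ndarray) -> np.ndarray:
    """Log energy in 8 frequency bands of the clip's spectrum."""
    spectrum = np.abs(np.fft.rfft(clip))
    bands = np.array_split(spectrum, 8)
    return np.log1p([band.mean() for band in bands])

# Build a labeled dataset: 1 = healthy, 0 = degraded.
X = np.array([features(make_clip(h)) for h in [True, False] * 100])
y = np.array([1, 0] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

On this toy data the two classes differ strongly in band energy, so the classifier separates them easily; real reef recordings would need richer features and careful validation across sites.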
“This is a really exciting development. Sound recorders and AI could be used around the world to monitor the health of reefs and find out if attempts to protect and restore them are working,” said co-author Dr. Tim Lamont.
Although this study only demonstrated that the sounds of healthy and unhealthy reefs can be reliably told apart, identifying how the soundscape changes between these two states could help track the progress of coral reef restoration.