Marine Conservation Projects
Because critical ocean habitats are extremely difficult to explore and monitor, scientists have turned to unmanned underwater vehicles and hydrophone arrays to better understand the numbers and behaviors of key animal species. Both types of systems generate enormous amounts of data that are difficult and time-consuming to process manually. Artificial intelligence can dramatically reduce this effort and improve conservation of key ocean species.
Image of a Yelloweye Rockfish (Sebastes ruberrimus) taken by a WDFW underwater robot.
Seaeye Falcon underwater robot used by WDFW in rockfish surveys.
Accelerating Rockfish Conservation
The Washington Department of Fish and Wildlife (WDFW) uses an underwater robot to survey threatened rockfish species in Washington State's Salish Sea and estimate their population sizes. Robots are an effective, non-lethal, and minimally disruptive way to locate rockfish in their deep-water habitat. However, each hour of video captured by the robot requires about 16 hours of staff time to identify all of the fish it contains. The goal of this collaboration is to develop artificial intelligence capabilities that reduce video annotation time by up to 50% and produce survey results more quickly, allowing faster responses to changing fish populations and better informing recovery efforts.
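To make the idea concrete, the sketch below shows one generic way such an assistive tool could work: a pretrained object detector is run over sampled frames of survey video, and its high-confidence detections are surfaced as candidate fish for an annotator to confirm. This is a minimal illustration, not WDFW's actual pipeline; the COCO-pretrained model, file name, frame-sampling rate, and score threshold are placeholder assumptions, and a production system would be fine-tuned on labeled rockfish imagery.

```python
import cv2                      # video decoding
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# Generic COCO-pretrained detector as a stand-in; a real system would be
# fine-tuned on labeled rockfish frames. (On torchvision < 0.13, use
# pretrained=True instead of weights="DEFAULT".)
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def propose_detections(video_path, every_n_frames=30, score_threshold=0.7):
    """Yield (frame_index, boxes, scores) for sampled frames with candidate fish."""
    cap = cv2.VideoCapture(video_path)
    frame_idx = 0
    while True:
        ok, frame_bgr = cap.read()
        if not ok:
            break
        # Only score every Nth frame to keep compute manageable on long dives.
        if frame_idx % every_n_frames == 0:
            frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
            with torch.no_grad():
                out = model([to_tensor(frame_rgb)])[0]
            keep = out["scores"] > score_threshold
            if keep.any():
                yield frame_idx, out["boxes"][keep], out["scores"][keep]
        frame_idx += 1
    cap.release()

# Example usage (file name is hypothetical):
# for idx, boxes, scores in propose_detections("rov_dive_video.mp4"):
#     print(f"frame {idx}: {len(boxes)} candidate fish")
```

Surfacing detections for human confirmation, rather than replacing the annotator, is what allows a tool like this to cut annotation time while keeping survey counts trustworthy.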
Success in this collaboration would also benefit other fisheries from California to Alaska, which likewise use underwater robots to collect video of rockfish species and could adopt the methods developed here.
Collaboration Partner: Washington Department of Fish and Wildlife (WDFW)
Monitoring Dolphin Populations by Eavesdropping on Individually Distinct Whistles
Many human activities alter the marine environment and can have a wide range of harmful impacts on nearby dolphin populations. Monitoring populations effectively is important for evaluating these impacts and is critical for ensuring sustainable development, particularly as the large-scale transition to renewable energy brings installations such as offshore wind farms into the marine environment.
The EarthSenseAI Center has been working with scientists from the Woods Hole Oceanographic Institution (USA), Aarhus University (Denmark) and Cetacean Communication Research to develop AI algorithms that identify individual dolphins from the unique signature whistles that they produce. Automatic signature whistle classification using AI will enable remote monitoring of the movements and habitat use of individual dolphins.
The collaboration is also exploring how to extend this work to detect large numbers of individual dolphins across underwater acoustic arrays, which would enable cost-effective monitoring of the impacts of anthropogenic activities on dolphin populations.
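As a rough illustration of the kind of model involved, the sketch below converts a whistle clip into a mel-spectrogram and feeds it to a small convolutional network that outputs a score for each known individual. The architecture, sample rate, class count, and file name are assumptions for illustration only and do not describe the collaboration's actual algorithms.

```python
import torch
import torch.nn as nn
import torchaudio

# Illustrative sketch only: assign a signature-whistle clip to one of N known
# individuals from its mel-spectrogram. The class count, sample rate, and
# architecture are assumptions, and the network would need to be trained on
# labeled whistle clips before use.
N_INDIVIDUALS = 20
SAMPLE_RATE = 96_000   # assumed hydrophone sample rate

mel = torchaudio.transforms.MelSpectrogram(
    sample_rate=SAMPLE_RATE, n_fft=2048, hop_length=512, n_mels=64)

class WhistleClassifier(nn.Module):
    """Small CNN mapping a (1, n_mels, time) spectrogram to individual IDs."""
    def __init__(self, n_classes=N_INDIVIDUALS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))
        self.head = nn.Linear(32, n_classes)

    def forward(self, spec):
        return self.head(self.features(spec).flatten(1))

model = WhistleClassifier()   # would be trained on labeled clips before use

def identify(waveform):
    """Return the most likely individual ID for a mono whistle clip."""
    spec = mel(waveform).log1p().unsqueeze(0)   # shape (1, 1, n_mels, time)
    with torch.no_grad():
        logits = model(spec)
    return int(logits.argmax(dim=1))

# waveform, sr = torchaudio.load("whistle_clip.wav")   # hypothetical file
# print(identify(waveform))
```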
Photo by the Sarasota Dolphin Research Program taken under NMFS Scientific Research Permit No. 522-1785.
Talk by Dr. Frants Jensen, Aarhus University, discussing this collaboration. The first 30 minutes covers dolphin social interactions, the importance of signature whistles in those interactions, and the data collection in Sarasota, Florida. The second part of the talk covers the deep learning aspect of the project.
Global Library of Underwater Biological Sounds (GLUBS)
Sound is a critically important sensing modality in underwater habitats. It is crucial for the survival of many underwater species, such as dolphins and other cetaceans, which use it for communication and hunting. Sound can also be used to assess the health of threatened ecosystems, such as coral reefs. However, the origins of many underwater sounds have not been identified. In addition, underwater acoustic analysis today relies mainly on regional and taxa-specific repositories of sounds and a wide variety of analysis tools. This approach does not scale with the rapid growth of underwater acoustic arrays and slows the integration of acoustic analysis into underwater conservation activities.
The Global Library of Underwater Biological Sounds (GLUBS) is an international partnership bringing together a wide variety of experts, including bioacousticians, bioinformaticians, ecologists, data scientists, and technologists. They share the goals of integrating and expanding open libraries of underwater sound recordings and of developing and sharing tools for underwater acoustic analysis. The open sound libraries will enable training of machine learning models for detecting and classifying animal sounds.
The GLUBS mission, as described on their website:
Our mission is to build datasets of known and unknown sounds to create automatic call detectors for fish, mammals, and invertebrates that will be open access and user-friendly. GLUBS is committed to advancing the field of bioacoustics research and providing researchers with the tools they need to study the underwater world.
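As a very simple illustration of what an automatic call detector can look like, the sketch below flags segments of a recording whose band-limited energy rises well above the background level; such candidate events could then be labeled against a library like GLUBS or passed to a trained classifier. The frequency band, thresholds, and file name are illustrative assumptions, not a GLUBS tool.

```python
import numpy as np
import librosa

# Minimal energy-based call detector sketch (parameters are assumptions):
# flag short-time frames whose energy in a chosen frequency band exceeds the
# recording's median (background) level by a margin, yielding candidate
# sound events for labeling or classification.
def detect_candidate_calls(path, fmin=200.0, fmax=20_000.0,
                           n_fft=2048, hop_length=512, threshold_db=10.0):
    """Return (start_s, end_s) pairs for frames exceeding the noise floor by threshold_db."""
    y, sr = librosa.load(path, sr=None, mono=True)
    spec = np.abs(librosa.stft(y, n_fft=n_fft, hop_length=hop_length))
    freqs = librosa.fft_frequencies(sr=sr, n_fft=n_fft)
    band = (freqs >= fmin) & (freqs <= fmax)
    # Per-frame energy (dB) restricted to the band of interest.
    band_db = librosa.amplitude_to_db(spec[band].mean(axis=0))
    noise_floor = np.median(band_db)
    times = librosa.frames_to_time(np.arange(len(band_db)), sr=sr,
                                   hop_length=hop_length)
    frame_dur = hop_length / sr
    return [(float(t), float(t + frame_dur))
            for t, level in zip(times, band_db)
            if level - noise_floor > threshold_db]

# print(detect_candidate_calls("reef_recording.wav"))   # hypothetical file
```

In practice, a detector of this kind serves only as a first pass; adjacent flagged frames would be merged into events and screened by a classifier or a human analyst.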
Additional information:
GLUBS is a UN Decade of Ocean Science for Sustainable Development endorsed project.
GLUBS is also a working group of the Scientific Committee on Oceanic Research (SCOR).
The need for GLUBS was initially described in a 2022 paper in Frontiers in Ecology and Evolution.
The EarthSenseAI Center contributes to both the cyber-infrastructure and artificial intelligence working groups of GLUBS.