Expanding citizen science with computer vision for fish monitoring | MIT News

Each spring, herring migrate from Massachusetts’ coastal waters to begin their annual journey up rivers and streams to freshwater. River herring have experienced significant population declines over the past few decades, and their migrations are increasingly being monitored throughout the region, mostly through regular visual counts and volunteer-based programs.

Monitoring fish movements and understanding population dynamics is critical to informing conservation efforts and supporting fisheries management. As this year's herring run continues this month, researchers and resource managers once again face the challenge of counting migratory fish as accurately as possible.

A team of researchers from the Woodwell Climate Research Center, MIT Sea Grant, the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), MIT Lincoln Laboratory, and Intuit tested a new monitoring method using underwater video and computer vision to complement citizen science efforts. The researchers — Zhongqi Chen and Linda Deegan of Woodwell Climate Research Center, Robert Vincent and Kevin Bennett of MIT Sea Grant, Sara Beery and Timm Haucke of MIT CSAIL, Austin Powell of Intuit, and Lydia Zuehsow of MIT Lincoln Laboratory — published a paper describing the work in the journal Remote Sensing in Ecology and Conservation in February.

The open access paper, “From images to continuous measurements: Expanding citizen science with computer vision for fish monitoring,” describes how recent advances in computer vision and deep learning, from object detection and tracking to species classification, offer promising real-world solutions for automated fish counting with improved performance and data quality.

Traditional monitoring methods are limited by time, environmental conditions, and labor availability. Volunteer visual counts are restricted to short sampling windows during the day, missing movement at night as well as brief pulses of migration, when hundreds of fish can pass within a few minutes. Although technologies such as acoustic monitoring and imaging sonar have enabled continuous fish monitoring under certain conditions, the most promising and cost-effective option – manual review of underwater video – is still laborious and time-consuming. With the increasing demand for automated video processing solutions, this study presents a reliable, cost-effective, and efficient deep learning-based system for automated fish monitoring.

The team built an end-to-end pipeline – from in-field underwater cameras to video labeling and model training – to achieve automated, computer vision-powered fish counts. The videos were collected from three rivers in Massachusetts: the Coonamessett River in Falmouth, the Ipswich River in Ipswich, and the Santuit River in Mashpee.
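A central step in a pipeline like this is linking per-frame fish detections into tracks so that each fish is counted once rather than once per frame. The sketch below illustrates the general idea with greedy intersection-over-union (IoU) matching; the paper's actual detector and tracker are not described in this article, so the functions, thresholds, and box format here are illustrative assumptions only.

```python
# Illustrative sketch: link per-frame detections into tracks by greedy
# IoU matching. NOT the paper's method — a generic, simplified example.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)

    def area(r):
        return (r[2] - r[0]) * (r[3] - r[1])

    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def link_tracks(frames, iou_threshold=0.3):
    """Greedily extend each track with its best-overlapping detection."""
    tracks = []  # each track is a list of boxes, one per matched frame
    for detections in frames:
        unmatched = list(detections)
        for track in tracks:
            best = max(unmatched, key=lambda d: iou(track[-1], d),
                       default=None)
            if best is not None and iou(track[-1], best) >= iou_threshold:
                track.append(best)
                unmatched.remove(best)
        tracks.extend([d] for d in unmatched)  # start new tracks
    return tracks

# Two frames: one fish moving right, a second fish appearing in frame 2.
frames = [
    [(10, 10, 50, 40)],
    [(18, 12, 58, 42), (200, 100, 240, 130)],
]
print(len(link_tracks(frames)))  # 2 tracks
```

In practice the detections would come from a trained object detector and the matching would be more robust (e.g., motion prediction, handling missed detections), but the track-then-count structure is the same.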

To prepare the training dataset, the team selected video clips with variations in brightness, water clarity, fish species and density, time of day, and season to ensure that the computer vision model would work reliably across a variety of real-world conditions. They used an open-source web platform to manually label the videos frame by frame with bounding boxes that track fish movements. In total, they labeled 1,435 video clips and annotated 59,850 frames.
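Frame-by-frame bounding-box labels of this kind are typically stored as one record per box, keyed by clip and frame. The annotation schema below is hypothetical (the article does not specify the team's label format); it only shows how such labels yield the clip/frame totals the team reported.

```python
# Hypothetical annotation records: (clip_id, frame_index, x1, y1, x2, y2).
# The actual label schema used in the study is not specified in the article.

labels = [
    ("clip_001", 0, 120, 80, 180, 110),
    ("clip_001", 1, 125, 82, 185, 112),  # same fish, next frame
    ("clip_002", 0, 300, 150, 360, 185),
]

def dataset_summary(labels):
    """Count distinct labeled clips, labeled frames, and boxes — the kind
    of totals the team reported (1,435 clips; 59,850 frames)."""
    clips = set()
    frames = set()
    for clip_id, frame_idx, *_box in labels:
        clips.add(clip_id)
        frames.add((clip_id, frame_idx))
    return {"clips": len(clips), "frames": len(frames), "boxes": len(labels)}

print(dataset_summary(labels))  # {'clips': 2, 'frames': 3, 'boxes': 3}
```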

The researchers compared and validated the computer vision counts against human video review, streamside visual counts, and passive integrated transponder (PIT) tag data. They concluded that models trained on diverse multi-site, multi-year data performed very well and produced high-resolution seasonal counts consistent with traditionally established estimates. Further, the system provided information on migration behavior, timing, and movement patterns linked to environmental factors. Using 2024 video of the Coonamessett River migration, the system counted 42,510 river herring and revealed that upstream migration peaked in the morning, while downstream migration occurred at night, with the fish using dark, quiet periods to avoid predators.
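Separating upstream from downstream movement, as described above, amounts to checking which direction each fish track crosses a counting line. The sketch below shows a minimal version, assuming tracks are sequences of centroid positions along the flow axis and that upstream is the increasing direction; the paper's actual geometry and counting logic are not detailed in this article.

```python
# Minimal, illustrative direction-based counting from fish tracks.
# Assumes each track is a list of positions along the river axis and
# that upstream movement increases the coordinate — an assumption for
# this sketch, not the study's actual setup.

def count_migrations(tracks, line_x):
    """Count tracks that cross line_x, split into upstream/downstream."""
    upstream = downstream = 0
    for track in tracks:
        start, end = track[0], track[-1]
        if start < line_x <= end:
            upstream += 1
        elif start >= line_x > end:
            downstream += 1
        # tracks that never cross the line are not counted
    return upstream, downstream

tracks = [
    [10, 60, 130],   # crosses the line moving up: upstream
    [150, 90, 40],   # crosses the line moving down: downstream
    [10, 30, 45],    # stays below the line: not counted
]
print(count_migrations(tracks, line_x=100))  # (1, 1)
```

Binning such crossings by time of day is what produces diel patterns like the morning upstream peak and nighttime downstream movement reported for the Coonamessett River.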

Through this real-world application, the researchers aim to advance computer vision in fisheries management and provide a framework and best practices for integrating the technology into efforts to conserve aquatic species. “MIT Sea Grant has been supporting work on this topic for a long time, and this excellent work by Zhongqi Chen and colleagues will improve fisheries monitoring capabilities and fish population assessment for fisheries managers and conservation groups,” said Vincent. “It will also provide education and training for students, the public, and citizen science groups in support of environmentally and culturally important animal populations in our coastal areas.”

However, continued traditional monitoring remains essential to maintain consistency in long-term datasets until fisheries management organizations fully adopt automated counting systems. Rather than a replacement, computer vision and citizen science should be seen as complementary: volunteers will still be needed to maintain the cameras and can contribute directly to the computer vision workflow, from video annotation to model validation. The researchers believe that combining citizen observations with data generated by computer vision will support a more comprehensive approach to environmental monitoring.

This work is funded by the MIT Sea Grant, with additional support provided by the Northeast Climate Adaptation Science Center, the MIT Abdul Latif Jameel Water and Food Systems seed grant, the AI and Biodiversity Change Global Center (supported by the National Science Foundation and the Natural Sciences and Engineering Research Council of Canada), and the MIT Undergraduate Research Opportunities Program.
