How ML helps protect vital marine ecosystems
Our machine learning team helped a civil agency reduce full motion video analysis time by 99%, making it possible to protect more vital coral reef than ever before.

As they lie quietly along tropical coastlines and islands, coral reefs are doing important work. Coral reefs provide a rich ecosystem – one of the densest on the planet. Up to 25% of all fish species rely on them for part of their lifecycle. And coastal human communities rely on them, too. Coral reefs provide food security for more than 500 million people in 100 countries. Countless families rely on them as a means of income. And they are a trillion-dollar economic asset (Hoegh-Guldberg, 2015).
Those are just a few of the reasons protecting coral reefs is so vital. But rising ocean temperatures, pollution, and acidity all threaten fragile coral ecosystems (Burke et al., 2011; Hughes et al., 2017). And so, tracking and monitoring coral health has become an important civil and humanitarian goal.
But it’s also a tough problem – one that’s costly and time consuming to solve. Organizations that aim to find, map, and monitor these reefs have been doing so with brute-force manual labor. Analysts watch videos taken of the ocean floor frame-by-frame, meticulously characterizing the contents of each frame. What percentage is coral? Algae? How many fish are in the scene? Marine specialists call this the habitat topology.
One such organization turned to Redpoint AI for help.
They needed a solution that would:
- Help analysts spend less time characterizing imagery. Each 90- to 120-minute video runs at 30 frames per second, or 162,000 frames for a 90-minute video. Even with a top analyst working at peak speed, averaging about one second per frame, that adds up to an insurmountable 45 hours per video.
- Reduce error rates. Human tagging is tedious and error prone. Fatigue and poor image quality can lead to mischaracterization. That means important reefs or habitats could be mislabeled or missed – and accidentally excluded from protective action.
- Make better use of the volumes of available video. The collection of ocean floor videography far exceeded what was possible for their team to analyze. Those videos may contain information needed to find and monitor coral reefs. But if the videos aren’t analyzed, then it’s possible coral reefs go uncharacterized.
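The scale of that manual workload comes down to simple arithmetic. A quick sketch of the numbers above (the one-second-per-frame analyst pace is an assumption for illustration, consistent with the 45-hour figure):

```python
# Back-of-the-envelope estimate of manual frame-by-frame review time.
# Video figures come from the scenario above; the per-frame analyst
# pace is an assumed value for illustration.
FPS = 30                  # video frame rate
VIDEO_MINUTES = 90        # shorter end of the 90-120 minute range
SECONDS_PER_FRAME = 1.0   # assumed analyst pace at peak speed

total_frames = VIDEO_MINUTES * 60 * FPS
review_hours = total_frames * SECONDS_PER_FRAME / 3600

print(total_frames)   # 162000 frames in a 90-minute video
print(review_hours)   # 45.0 hours of analyst time
```

At one second per frame, a single 90-minute video already consumes more than a full work week of analyst time, before any quality review.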
To solve these problems, Redpoint AI created a process that incorporates multiple machine learning algorithms. We designed spatial and spectral classifiers to process each video frame. The classifiers use spectral characteristics to identify material composition, and spatial information to perform object detection.
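To make the spectral side of that pipeline concrete, here is a minimal sketch of per-pixel spectral classification and coverage estimation. Everything in it – the reference spectra, class names, and nearest-mean approach – is an illustrative assumption, not Redpoint AI's actual model:

```python
import numpy as np

# Illustrative reference spectra (mean RGB per material class).
# A real system would learn these from labeled imagery.
REFERENCE_SPECTRA = {
    "coral": np.array([180.0, 120.0, 110.0]),
    "algae": np.array([60.0, 140.0, 70.0]),
    "sand":  np.array([200.0, 190.0, 160.0]),
}

def spectral_classify(frame):
    """Label each pixel with the nearest reference spectrum."""
    h, w, _ = frame.shape
    pixels = frame.reshape(-1, 3).astype(float)
    names = list(REFERENCE_SPECTRA)
    refs = np.stack([REFERENCE_SPECTRA[n] for n in names])           # (k, 3)
    dists = np.linalg.norm(pixels[:, None, :] - refs[None], axis=2)  # (n, k)
    labels = dists.argmin(axis=1).reshape(h, w)
    return names, labels

def coverage(frame):
    """Percent of the frame covered by each material class."""
    names, labels = spectral_classify(frame)
    total = labels.size
    return {n: 100.0 * (labels == i).sum() / total for i, n in enumerate(names)}

# Usage: a synthetic frame that is entirely sand-colored pixels.
frame = np.full((4, 4, 3), [200, 190, 160], dtype=np.uint8)
print(coverage(frame))  # {'coral': 0.0, 'algae': 0.0, 'sand': 100.0}
```

The spatial half of the pipeline – object detection for counting fish and similar features – would run alongside this per-frame coverage estimate; the two outputs together characterize the habitat in each frame.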
With this ML classifier system, the organization can:
- Save hours of analyst time. The automated ML analysis needs little manual tagging and takes only seconds per frame. That's an improvement of two orders of magnitude over baseline. Analysts can now focus on the tasks only humans can accomplish.
- Reduce errors. The robust AI/ML algorithm improves accurate characterization of benthic biological coverage and other features of the ocean floor.
- Exploit available data. The algorithm can characterize one frame in a few seconds using ordinary computational hardware, decreasing the time to exploit a 120-minute video by greater than 99%. That means the organization can analyze over two hours of video in the time it used to take an analyst to characterize just one minute.
“Before we deployed this algorithm, a lot of data was underutilized,” says Dr. Jeff Clark, President of Redpoint AI. “And now they can characterize and map so much more of the ocean floor.”
That means that more coral reefs can be found, tracked, and monitored – an important first step in protecting the health of these vital ecosystems.
For more information on full motion video (FMV) ML classification, email us at hello@redpoint-ai.com.

