Accurately monitoring a fishery—knowing how many and which fish species are being caught, and what unfortunate creatures are being dragged in as bycatch along the way—has never been easy. Around the world, the job of keeping tabs on fishers has typically fallen to people called fisheries observers, who temporarily join a fishing crew to watch and record. There to take scientific observations and report any rule breaking, these independent monitors often have a difficult and dangerous job. Harassment and unsafe working conditions are common, and violence can be rife. Every year since 2009, at least one fisheries observer has gone missing at sea.
In other words, relying on onboard human observers is a notoriously imperfect way to regulate fishing activity. Slightly easier and cheaper, and decidedly safer, is fitting fishing vessels with cameras that record their catches. But even video review is costly and time-consuming. A single fishing trip can generate hundreds of hours of footage that someone still has to comb through to identify and count the animals flying past the camera.
For small-scale fishers, like those from Indigenous communities on the west coast of Canada, video monitoring and review can be even more of a burden. Though monitoring is essential for managing fish stocks, the Canadian government’s fisheries monitoring system was designed with large, commercial, single-species fisheries in mind, says Lauren Dean, a communications specialist for the Ha’oom Fisheries Society, an Indigenous organization representing five First Nations near Tofino, British Columbia.
Unlike regular commercial fishers, however, Ha’oom’s fishers often embark on relatively modest expeditions targeting several species at once. In the Canadian government’s view, each species targeted counts as a different commercial fishery with its own legal monitoring requirements. “Dealing with five or six different systems for monitoring is simply not viable economically,” Dean says.
In recent years, though, a solution has emerged to make fisheries monitoring more accessible, more efficient, and safer: artificial intelligence.
As machine learning, image recognition, and other forms of artificial intelligence grow ever more potent, a startup based in Vancouver, B.C., called OnDeck AI is developing a computerized fisheries monitoring system that could radically change the video review process for Ha’oom’s fishers and others. Though the system is still in the design and testing stages, OnDeck AI hopes to one day be able to automatically detect and count fish in video footage, streamlining the monitoring process. The technology has big implications for everything from fisheries management and conservation to Indigenous nations’ access to data about their waters and resources.
OnDeck AI is the brainchild of Alexander Dungate, who grew up recreationally catching prawns, crab, and sole with his family on the B.C. coast. In 2021, while studying computer science and biology at the University of British Columbia (UBC), he learned about the world of fisheries monitoring—including the hazardous, sometimes lethal situations faced by independent monitors. He also learned that the archive of fisheries catch footage that has already been amassed is so massive that fully analyzing all of it is nearly impossible. Across the border in the United States, for instance, the Pacific States Marine Fisheries Commission, which helps regulate fisheries in California, Oregon, Washington, Idaho, and Alaska, has a year and a half’s worth of video piled up awaiting review.
Dungate reached out to friend and fellow UBC student Sepand Dyanatkar, who was studying machine learning, a branch of AI that focuses on developing algorithms that can learn to analyze data without explicit programming. Building on an AI concept called “master object tracking,” the pair developed a computer program capable of visually identifying an object—in this case a fish—as it moves through a video.
Previous efforts to use AI to audit fisheries catch footage have typically run into the same two challenges. One is the generalizability problem. Whether it’s turbulent weather, bad lighting, waves and spray splashing the camera, or the boat itself getting tossed around in a storm, the quality of video captured at sea is highly variable, making analysis difficult.
Problem two is the data problem. It takes a significant amount of time and effort to annotate the images and videos needed to train an AI model to pick an individual fish out of an endless barrage of footage, let alone recognize the species. What’s more, training a typical AI system to recognize all the rare species—which are often the most important given their ecological significance—is simply not possible, says Dungate, since there just isn’t enough footage to work from. For instance, if a killer whale accidentally gets caught in a fisher’s net, “we really need to be able to recognize that. But there’s maybe three photos on Earth of that happening,” Dungate says.
OnDeck AI sidesteps those issues by designing its system to work without labeled training data, instead giving the model the ability to recognize what it’s seeing across any setting, including footage of varying quality captured by different types of cameras in changing weather. Dungate likens the difference between traditional AI models and OnDeck AI’s to the difference between memorizing the answers to a test and using more complex reasoning to figure them out. In other words, the system can identify a fish based on its coloring, shape, and fin structure—even if it has never seen that species before.
That’s the long-term objective, at least. But for now, OnDeck AI’s model needs more priming. So last summer, the company trained its AI system on Ha’oom’s video footage. To do that, Jessica Edwards, a Ha’oom biologist, analyzed catch footage alongside the AI. If the model failed to recognize a particular fish in the video feed, Edwards would draw a box around the overlooked animal to teach the algorithm. The goal, Edwards says, is to get the AI system to do as good a job as a human—or better—but in less time. Even before getting to that point, the AI can help an auditor do their job faster by identifying when a fish appears in the footage, allowing them to skip forward to relevant segments of the video rather than having to go through the whole feed.
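The review loop Edwards describes—the model proposes detections, a human draws boxes around the fish it missed, and those corrections become new training data while detection timestamps let auditors skip ahead—can be sketched in simplified form. This is an illustrative outline only: the names (`ReviewSession`, `detector`, `reviewer`, and so on) are hypothetical stand-ins, not OnDeck AI's actual code, and real systems would operate on video frames rather than toy indices.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """A reviewer's correction: a box drawn around a fish the model missed."""
    frame: int
    box: tuple  # (x, y, width, height)

@dataclass
class ReviewSession:
    corrections: list = field(default_factory=list)

    def review(self, frames, detector, reviewer):
        """Human-in-the-loop pass over footage.

        detector(frame) -> list of boxes the model proposes.
        reviewer(frame, boxes) -> list of boxes the model missed.
        Returns the frames where the model saw fish, so an auditor
        can skip straight to the relevant segments.
        """
        flagged = []
        for i, frame in enumerate(frames):
            boxes = detector(frame)
            if boxes:
                flagged.append(i)
            for box in reviewer(frame, boxes):
                # Each missed fish becomes training data for fine-tuning.
                self.corrections.append(Annotation(i, box))
        return flagged

# Toy demo: 4 frames, fish present in frames 1 and 3; the model misses frame 3.
truth = {1: [(10, 10, 40, 20)], 3: [(5, 8, 30, 15)]}
detector = lambda f: truth.get(f, []) if f != 3 else []
reviewer = lambda f, boxes: [b for b in truth.get(f, []) if b not in boxes]

session = ReviewSession()
flagged = session.review(range(4), detector, reviewer)
```

In this sketch, `flagged` holds only the segments worth auditing, and `session.corrections` accumulates exactly the annotations (here, the fish in frame 3) that would be fed back to improve the model—mirroring how each box Edwards draws teaches the algorithm.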
While OnDeck’s AI system still needs more tuning, scientists and engineers elsewhere are also trying to crack the AI fisheries observer problem. One team in Australia, for example, is developing an AI system capable of distinguishing between 12 different species in video footage. Working with Australian longline tuna and billfish fishers, their system picks the right species roughly 90 percent of the time.
Bubba Cook, the policy director for Sharks Pacific, a New Zealand-based NGO, believes AI-driven monitoring could dramatically increase how much of the world’s oceans are under some sort of surveillance.
Only a small fraction of the world’s fisheries—roughly two percent—are currently monitored by observers, meaning the vast majority of fishing activity, including the bycatch of protected species, happens without oversight.
Though human observers will still be necessary for things like tissue sampling, “the reality is,” says Cook, that “electronic monitoring coupled with AI review is the only way we’re going to get the level of observer coverage we need.”