November 14, 2017
Web Feature

PNNL Develops EyeSea, a Machine-Learning Tool to Automate Video Analysis of MHK Sites

New tool aids in determining the environmental impact of tidal, wave, and small hydropower

EyeSea can "watch" underwater video footage and automatically identify when wildlife enters the frame.

Marine hydrokinetic (MHK) energy taps the movement of waves, tides, and water currents to turn a turbine and produce electricity. Based on prior analysis of MHK resources, developing a fraction of the wave energy available off the West Coast could satisfy more than 10% of Pacific states' electricity demand. Developing just one-sixth of the available wave energy in the five Pacific states could power more than 5 million homes and support roughly 33,000 jobs.

So, what's the holdup?

One thing impeding the growth of MHK is its potential impact on the environment. To better understand those effects and accelerate MHK deployment, researchers at PNNL are developing new technologies to measure and evaluate the environmental performance of MHK systems.

PNNL scientists and engineers recently created a software tool—EyeSea—that automates the analysis of underwater video footage. A common way to see how fish and marine mammals interact with MHK systems is simply to set up an underwater camera and record. One hour of video, however, can take five or more hours to assess manually, and it is not uncommon to have hundreds of hours of footage to review.

To mitigate the problem, operators and researchers often resort to sub-sampling: picking random one-hour intervals of footage to evaluate. While this approach speeds up analysis, it reduces accuracy.
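Sub-sampling as described here is straightforward to express in code. The following is a minimal sketch, assuming footage is indexed by whole hours; the function name and parameters are illustrative, not part of any actual monitoring workflow.

```python
import random

def subsample_hours(total_hours, review_hours, seed=None):
    """Pick a random subset of one-hour segments to review manually.

    total_hours: how many hour-long segments of footage exist.
    review_hours: how many segments the reviewers can afford to watch.
    Returns the chosen segment indices in chronological order.
    """
    rng = random.Random(seed)
    return sorted(rng.sample(range(total_hours), review_hours))
```

The trade-off the article notes is visible here: any event falling outside the chosen segments is simply never seen.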

Funded by DOE's Water Power Technologies Office, EyeSea uses machine vision algorithms to “watch” video footage for any instance in which a fish or mammal is near an MHK turbine. The tool automatically detects when a fish or mammal enters the frame and flags the event. The flagged events tell an operator which segments of footage need to be evaluated, significantly reducing labor time.
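The article does not describe EyeSea's algorithms in detail. One common machine-vision approach to this kind of event flagging is frame differencing against a running background estimate: frames where enough pixels change are flagged for review. The sketch below illustrates that general idea only; the function name, thresholds, and update rate are assumptions, not EyeSea's actual implementation.

```python
import numpy as np

def flag_events(frames, threshold=25, min_pixels=50):
    """Flag frame indices where enough pixels differ from a running
    background estimate -- a crude stand-in for automated detection.

    frames: iterable of 2-D uint8 grayscale arrays.
    threshold: per-pixel intensity change treated as motion.
    min_pixels: minimum count of changed pixels needed to flag a frame.
    """
    background = None
    flagged = []
    for i, frame in enumerate(frames):
        frame = frame.astype(np.float32)
        if background is None:
            # first frame seeds the background model
            background = frame.copy()
            continue
        diff = np.abs(frame - background)
        if np.count_nonzero(diff > threshold) >= min_pixels:
            flagged.append(i)
        # blend the current frame in slowly so gradual lighting
        # drift is absorbed into the background
        background = 0.95 * background + 0.05 * frame
    return flagged
```

An operator would then review only the footage around the flagged indices rather than every hour of video.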

For video footage taken underwater, determining whether an object is an animal or not can be challenging and time consuming.

PNNL recently developed and tested the tool using footage from an MHK pilot project run by Ocean Renewable Power Company (ORPC) in Igiugig, Alaska. For about two months, an ORPC turbine generated electricity during the middle of Alaska's annual salmon run. Using data from the pilot project, PNNL staff developed EyeSea to analyze the underwater footage and detect when fish were around the turbine.

Researchers analyzed 43 hours of video footage, observing fewer than 20 fish interactions and no fish injuries. Equally important, PNNL assessed the accuracy of EyeSea and determined it was 85 percent accurate at detecting when wildlife was present. From these data, PNNL is refining the algorithms behind EyeSea. If successful, EyeSea will be made available to MHK operators and developers to streamline siting and permitting processes and meet post-installation monitoring requirements at future MHK sites.
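The article does not say how the 85 percent figure was computed. One common way to score a detector against manual review is the fraction of reviewer-confirmed events that the tool also flagged (recall). The sketch below uses hypothetical event identifiers and is not based on the project's data.

```python
def detection_rate(manual_events, flagged_events):
    """Fraction of manually confirmed events the tool also flagged."""
    if not manual_events:
        return 0.0
    flagged = set(flagged_events)
    return sum(e in flagged for e in manual_events) / len(manual_events)
```

A fuller evaluation would also count false alarms (segments flagged with no animal present), since those still cost reviewer time.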



PNNL Research Team

Genevra Harker-Klimes, Shari Matzner, and Garrett Staines