May 10, 2024

Triton: Igiugig Fish Video Analysis - Project Report


This document summarizes the analysis of video data collected around the Ocean Renewable Power Company's RivGen® device in the Kvichak River during July and August 2015. Human (manual) analysis of the data was undertaken and used to develop automated algorithms to detect fish in the frames and describe interaction behavior. In addition, a web application, EyeSea, was developed to combine manual and automated processing, so that ultimately the automated algorithms could be used to identify where human analysis was needed (i.e., when fish are present in video frames).

The aim of the project was to develop algorithms that could identify video frames with fish present, which could then be used to enable faster manual analysis. The project also produced recommendations for future data collection, and identified research that could aid analysis and interpretation.

The manual analysis began with all data from the start of the deployment, to ensure that rare events were captured, and initially focused on nighttime data when more fish were present. This process highlighted how time-consuming fish identification is: ultimately only 42.33 hours of video were reviewed. Detections were classified as 'Fish', when the reviewer was confident it was a fish, or 'Maybe', when it was difficult to distinguish; classification was based on movement, shape, and color characteristics. 'Fish' were further classified as 'adult', 'juvenile', or 'unidentifiable'. Behavioral attributes were also noted, broadly divided into passive and avoidance activities. In over 42 hours of data, there were only 20 potential contact interactions, of which 3 were 'Maybe' classifications, 12 were juveniles, and 5 were adults; on only one occasion was an actual contact confirmed, and this was an adult fish with the camera, not the turbine itself.
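The classification scheme described above (confidence level, life stage, behavior, and potential-contact flag) lends itself to a standardized annotation record of the kind EyeSea's combined manual/automated output would need. The sketch below is a hypothetical schema, not the project's actual data model; all class and field names are illustrative assumptions.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Confidence(Enum):
    FISH = "fish"    # reviewer confident the object is a fish
    MAYBE = "maybe"  # difficult to distinguish from debris or particles

class LifeStage(Enum):
    ADULT = "adult"
    JUVENILE = "juvenile"
    UNIDENTIFIABLE = "unidentifiable"

class Behavior(Enum):
    PASSIVE = "passive"        # e.g., drifting with the flow
    AVOIDANCE = "avoidance"    # active movement away from the device

@dataclass
class FishAnnotation:
    """One reviewed detection in a video frame (hypothetical schema)."""
    frame_index: int
    confidence: Confidence
    life_stage: Optional[LifeStage] = None  # only set for confident 'Fish'
    behavior: Optional[Behavior] = None
    potential_contact: bool = False         # candidate strike/near-miss event

# Example record: a confident adult sighting flagged as a potential contact
event = FishAnnotation(
    frame_index=102345,
    confidence=Confidence.FISH,
    life_stage=LifeStage.ADULT,
    behavior=Behavior.AVOIDANCE,
    potential_contact=True,
)
```

A shared record format like this is what lets manual review and automated detection write to one standardized output, so the two can be compared frame by frame.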
This also highlights the difficulty of confirming whether a strike or collision event occurred or was a near-miss. More interactions were detected at night; this was probably biased by the use of artificial light at night, which may have attracted fish, but could also have increased detection probability because the light is reflected from the fish itself.

For the algorithm development, background subtraction, optical flow, and deep learning techniques were considered. The deep learning approach was judged to need too much training data for this application and was not pursued. The optical flow analysis was considered promising but did not give immediate results and would need further investigation. Background subtraction was therefore the main focus. Three methods of background subtraction were tried: Robust Principal Components Analysis (RPCA), Gaussian Mixture Model (GMM), and Video Background Extraction (ViBE). A classification technique was then applied to the foreground images to determine fish presence. Using this combination, it was found that fish could be accurately detected when occupying more pixels (>200 pixels, 98.2% correct; 100-200 pixels, 99.6% correct; 5-100 pixels, 85.4% correct; 2-5 pixels, 66.3% correct).

In parallel, EyeSea was developed to convert the video data to a usable form and to enable manual and automated analysis with a standardized output. Recommendations for further research and standardized methods for data collection are given.
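The pipeline described above pairs a background-subtraction step with a size-aware classification of the resulting foreground regions. As a minimal illustration only, the sketch below substitutes a simple per-pixel temporal median for the RPCA/GMM/ViBE methods actually evaluated in the project, then measures the pixel count of each connected foreground blob, the quantity underlying the size classes quoted above (2-5, 5-100, 100-200, >200 pixels). All thresholds and the synthetic data are illustrative assumptions.

```python
import numpy as np
from collections import deque

def median_background(frames: np.ndarray) -> np.ndarray:
    """Estimate a static background as the per-pixel temporal median
    (a simple stand-in for RPCA/GMM/ViBE background subtraction)."""
    return np.median(frames.astype(float), axis=0)

def foreground_mask(frame: np.ndarray, background: np.ndarray,
                    threshold: float = 25.0) -> np.ndarray:
    """Mark pixels that differ strongly from the background estimate."""
    return np.abs(frame.astype(float) - background) > threshold

def blob_sizes(mask: np.ndarray) -> list:
    """Pixel counts of 4-connected foreground blobs (BFS flood fill)."""
    visited = np.zeros_like(mask, dtype=bool)
    rows, cols = mask.shape
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                size = 0
                queue = deque([(r, c)])
                visited[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                sizes.append(size)
    return sizes

# Synthetic demo: an empty scene, then a frame with one bright 15x15 target
frames = np.zeros((10, 64, 64))
test_frame = np.zeros((64, 64))
test_frame[20:35, 20:35] = 255.0

bg = median_background(frames)
sizes = blob_sizes(foreground_mask(test_frame, bg))
# one 225-pixel blob -> falls in the ">200 pixels" size class
```

In the report's results, larger foreground regions were classified far more reliably than small ones, which is why binning detections by pixel count is a natural first step before classification.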



Matzner, S., C.K. Trostle, G.J. Staines, R.E. Hull, P. Avila, and G. Harker-Klimes. 2017. Triton: Igiugig Fish Video Analysis - Project Report. Richland, WA: Pacific Northwest National Laboratory. doi:10.2172/1485061.