IAHR Document Library


Proceedings of the 14th International Symposium on Ecohydrau...

Preliminary Application of Instance Segmentation for Monitoring Diurnal Activity of Freshwater Fish

Author(s): Wataru Ishikawa; Shinji Fukuda

Linked Author(s): Shinji Fukuda

Keywords: No keywords

Abstract: Advanced monitoring systems that integrate digital cameras and image processing technologies have increasingly attracted the attention of specialists and engineers across a wide range of research disciplines. In this study, we aimed to develop an image processing pipeline to monitor small freshwater fish automatically, both day and night. For this purpose, we used instance segmentation as an image processing technique, specifically for counting individual fish in an image. We collected images of our target fish, Nipponocypris temminckii, in a spring-fed stream, the Yagawa River in Tokyo. All images were taken during the daytime using digital cameras, including smartphones. Training data were augmented by adjusting brightness and sharpness, reducing image size, and applying copy-paste augmentation, from which six datasets were created by additively combining these methods. The test images were collected by time-lapse shooting of the target fish swimming in a series of laboratory aquarium experiments; half of them were taken in the evening using an infrared camera. We trained a Mask R-CNN model on each dataset and applied it to the test images to verify the accuracy of fish detection and to compare the applicability of the data augmentation methods employed. Results showed that combining multiple data augmentation methods improved the accuracy of fish detection and enabled fish detection in infrared images. In particular, we found that the copy-paste method improves detection accuracy on infrared images, supporting the effectiveness of this method for fish detection in infrared images taken under dark conditions.
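The augmentation steps named in the abstract (brightness and sharpness adjustment, size reduction, and copy-paste augmentation) can be sketched in a minimal form with the Pillow library. This is an illustrative sketch only, not the authors' pipeline: all function names, parameter values, and the synthetic stand-in images below are assumptions for demonstration.

```python
# Minimal sketch of the three augmentation families described in the abstract,
# using Pillow. Parameter values and coordinates are illustrative assumptions.
from PIL import Image, ImageEnhance


def photometric_augment(img, brightness=1.3, sharpness=2.0, scale=0.5):
    """Adjust brightness and sharpness, then reduce the image size."""
    out = ImageEnhance.Brightness(img).enhance(brightness)
    out = ImageEnhance.Sharpness(out).enhance(sharpness)
    w, h = out.size
    return out.resize((int(w * scale), int(h * scale)))


def copy_paste_augment(src, src_box, dst, dst_xy):
    """Copy-paste augmentation: crop an object region from one image
    and paste it into another, producing a new composite sample."""
    patch = src.crop(src_box)       # (left, upper, right, lower) region
    out = dst.copy()                # do not modify the destination in place
    out.paste(patch, dst_xy)        # (x, y) upper-left paste position
    return out


# Demo on synthetic images standing in for the fish photographs.
base = Image.new("RGB", (200, 100), (30, 60, 90))
aug = photometric_augment(base)                # brightened, sharpened, halved
composite = copy_paste_augment(
    base, (0, 0, 50, 50), Image.new("RGB", (200, 100)), (10, 10)
)
```

In a real instance-segmentation setting, copy-paste augmentation would also transfer the object's segmentation mask alongside the pixel patch so that the pasted instance carries a correct label; that bookkeeping is omitted here for brevity.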

DOI:

Year: 2022

Copyright © 2024 International Association for Hydro-Environment Engineering and Research. All rights reserved. | Terms and Conditions