Abstract
To design effective and efficient fish passage facilities at hydropower plants, knowledge of fish swimming behaviour is essential. Therefore, live wild fish were investigated at different fish guidance structures in an experimental flume with a test section 11 m long and 2.5 m wide at water depths of about 0.6 m. In addition to the analysis of time data and manual records of fish behaviour, video recordings of fish movements allow a more detailed analysis of fish behaviour under different hydraulic conditions. A videometry system was therefore installed, consisting of eleven synchronised cameras with overlapping fields of view, aligned under dry conditions outside the flume. A 3D tracking algorithm was developed and implemented to analyse the video data. The core of the code is a motion-based multiple object tracking method, in which several objects can be tracked simultaneously in 2D pixel-frame coordinates. After undistorting and stereo-calibrating the cameras, the 2D tracks are transformed into a 3D metric space according to their epipolar geometry. In this paper, video data from a single 15 min experimental run with three fish of 100–150 mm length are analysed as an example. The path-time diagram gives a distinct ‘big picture’ of the fish movement, which helps to identify preferred and avoided regions. However, owing to imperfections in the actual camera setup, a 3D view in the near field of the cameras and an automated separation of individual tracks within a group of fish remain challenging.
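The abstract outlines a pipeline of 2D multi-object tracking followed by undistortion, stereo calibration, and triangulation into 3D metric space. The sketch below illustrates only the last step under stated assumptions; it is not the authors' implementation. It assumes OpenCV and NumPy, and that the intrinsics (K1, K2, dist1, dist2) and the relative pose (R, T) of a camera pair are already available from a prior stereo calibration; the function name tracks_to_3d is hypothetical.

```python
# Minimal sketch (assumptions labelled above, not the authors' code):
# matched 2D pixel tracks from two overlapping cameras are undistorted
# and triangulated into 3D metric coordinates.
import cv2
import numpy as np

def tracks_to_3d(track_cam1, track_cam2, K1, dist1, K2, dist2, R, T):
    """Triangulate matched 2D pixel tracks (N x 2 arrays) into N x 3 metric points."""
    # Projection matrices: camera 1 at the origin, camera 2 posed by (R, T)
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K2 @ np.hstack([R, T.reshape(3, 1)])

    # Remove lens distortion; P=K keeps the result in pixel coordinates
    pts1 = cv2.undistortPoints(track_cam1.reshape(-1, 1, 2), K1, dist1, P=K1)
    pts2 = cv2.undistortPoints(track_cam2.reshape(-1, 1, 2), K2, dist2, P=K2)

    # Linear triangulation; result is 4 x N homogeneous coordinates
    pts4d = cv2.triangulatePoints(P1, P2,
                                  pts1.reshape(-1, 2).T,
                                  pts2.reshape(-1, 2).T)
    return (pts4d[:3] / pts4d[3]).T  # N x 3 metric positions
```

In practice, per-frame associations between the 2D tracks of neighbouring cameras would have to respect the epipolar constraint mentioned in the abstract before triangulation is applied.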