Abstract

An automatic bird identification system is required for offshore wind farms in Finland. Radar is the obvious choice for detecting flying birds, but external information is required for actual species identification. We use visual camera images as this external data. The proposed system for automatic bird identification consists of a radar, a motorized video head and a single-lens reflex camera with a telephoto lens. A convolutional neural network trained with a deep learning algorithm is applied to the image classification. We also propose a data augmentation method in which images are rotated and converted in accordance with the desired color temperatures. The final identification is based on a fusion of parameters provided by the radar and the predictions of the image classifier. On a dataset of 9312 manually taken original images, expanded by augmentation to about 2.44 × 10⁶ images, the sensitivity of the proposed system as an image classifier is 0.9463. The area under the receiver operating characteristic curve for the two key bird species is 0.9993 for the White-tailed Eagle and 0.9496 for the Lesser Black-backed Gull. We propose a novel system for automatic bird identification as a real-world application and demonstrate that our data augmentation method is suitable for the image classification problem and significantly increases the performance of the classifier.
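To illustrate the described augmentation, the sketch below rotates each original image and re-balances it toward a set of target color temperatures. It is a minimal sketch using Pillow and NumPy; the rotation angles, the temperature values and the per-channel gains are illustrative assumptions, not the parameters reported by the authors.

```python
# Hedged sketch of the augmentation idea: rotation + color-temperature shift.
import numpy as np
from PIL import Image

# Hypothetical per-channel gains approximating a few color temperatures (K).
COLOR_TEMP_GAINS = {
    3000: (1.00, 0.77, 0.54),   # warm
    5000: (1.00, 0.89, 0.81),
    6500: (1.00, 0.97, 1.00),   # roughly neutral daylight
    8000: (0.89, 0.91, 1.00),   # cool
}

def apply_color_temperature(img: Image.Image, gains) -> Image.Image:
    """Scale the R, G, B channels toward the target white balance."""
    arr = np.asarray(img).astype(np.float32)
    arr[..., 0] *= gains[0]
    arr[..., 1] *= gains[1]
    arr[..., 2] *= gains[2]
    return Image.fromarray(np.clip(arr, 0, 255).astype(np.uint8))

def augment(img: Image.Image, angles=(-10, -5, 0, 5, 10)):
    """Yield rotated and color-temperature-shifted variants of one image."""
    for angle in angles:
        rotated = img.rotate(angle, resample=Image.BILINEAR)
        for gains in COLOR_TEMP_GAINS.values():
            yield apply_color_temperature(rotated, gains)
```

With a handful of angles and temperatures per image, a few thousand originals can be expanded by two to three orders of magnitude, which is consistent with the scale of the augmented dataset mentioned in the abstract.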

Highlights

  • Several offshore wind farms are under construction on the Finnish west coast

  • We proposed a novel system for automatic bird identification as a real-world application

  • We demonstrated that our data augmentation method is suitable for the image classification problem and significantly increases the performance of the classifier


Introduction

Several offshore wind farms are under construction on the Finnish west coast. The official environmental specifications require that the behaviour of bird species in the vicinity of the wind turbines be monitored. A prototype system for automated bird identification has been developed and placed at a test location on the Finnish west coast. The principle of the WT-Bird system is that a bird collision can be detected by the sound of the impact and the bird species recognised by a non-real-time method from video footage [4,5]. It has known problems with false alarms for larger bird species in high-wind conditions, and it has no automated species identification algorithm [6]. In the final system, the images will be acquired automatically.

Radar System
Video Head Control
Camera Control
Input Data
Data Augmentation
The Proposed System
Classification
Convolutional Neural Network
Hyperparameter Selection
Results
Discussion