Abstract

SCADA operating data are increasingly used across the wind energy domain, both as a basis for power output prediction and for turbine health status monitoring. Current industry practice is to aggregate the signals at a coarse resolution, typically 10-min averages, in order to reduce data transmission and storage costs. However, aggregation, i.e., downsampling, induces an inevitable loss of information and is one of the main causes of skepticism towards the use of SCADA operating data to model complex systems such as wind turbines. This research aims to quantify the amount of information that is lost due to this downsampling of SCADA operating data and to characterize it with respect to the external factors that might influence it. The issue of information loss is framed by three key questions addressing effects on the local and global scale as well as the influence of external conditions. Moreover, recommendations for both wind farm operators and researchers are provided with the aim of improving the information content. We present a methodology to determine the ideal signal resolution that minimizes the storage footprint while guaranteeing high signal quality. Wind-related data, electrical signals, and gearbox temperatures emerged as the critical signals: they are strongly affected by information loss upon aggregation and are best recorded and stored at high resolution. All analyses were carried out using more than one year of 1 Hz SCADA data from an onshore wind farm of 12 turbines located in the UK.
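
To make the idea of an "ideal resolution" concrete, the following minimal sketch sweeps several aggregation windows over a 1 Hz series and reports how much of the original signal variance survives averaging. Retained variance is used here only as a simple stand-in metric, and the synthetic series stands in for a real SCADA channel; the paper's actual information measure and data are not reproduced here.

```python
import numpy as np
import pandas as pd

def retained_variance(signal: pd.Series, window: str) -> float:
    """Fraction of the original signal's variance kept after averaging."""
    downsampled = signal.resample(window).mean()
    return downsampled.var() / signal.var()

# Synthetic 1 Hz series (one day) standing in for a real SCADA channel.
idx = pd.date_range("2020-01-01", periods=86_400, freq="1s")
rng = np.random.default_rng(0)
signal = pd.Series(rng.normal(8.0, 2.0, size=len(idx)), index=idx)

# Coarser windows retain less of the original variability.
for window in ["10s", "1min", "10min"]:
    print(window, round(retained_variance(signal, window), 3))
```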

Highlights

  • In modern wind turbines, a plethora of operating data are acquired with high temporal frequency [1,2] by a vast number of sensors [3,4]

  • This study explores the information contained in high-frequency SCADA data to determine the characteristics and limitations of wind turbine SCADA data

  • The main goal of this contribution is to quantify the information lost due to temporal aggregation of operating data, as these data are usually only available as 10-min averages



Introduction

In modern wind turbines, a plethora of operating data are acquired with high temporal frequency [1,2] by a vast number of sensors [3,4]. The data are typically aggregated into 10-min average values, sometimes accompanied by the standard deviations or the maxima and minima measured in these intervals. This temporal aggregation of a signal, referred to as downsampling, saves a great deal of storage space and reduces the bandwidth needed when transferring the data, both of which translate into cost savings. The consequences arising from this signal conversion depend strongly on the further use of the data. Understanding these consequences, by knowing the properties of a signal after its transition to lower resolutions, will help to optimize both data storage and costs while at the same time providing the best possible signal quality for analytic investigations.
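
The aggregation described above can be sketched in a few lines of pandas. This is an illustrative example, not the authors' code: the file name "scada_1hz.csv" and the column name "wind_speed" are assumptions about the data layout.

```python
import pandas as pd

# Load a 1 Hz SCADA channel indexed by timestamp (file and column
# names are assumed for illustration).
raw = pd.read_csv("scada_1hz.csv", parse_dates=["timestamp"],
                  index_col="timestamp")

# Downsample to 10-min intervals, keeping the mean plus the standard
# deviation, minimum, and maximum of each interval, as is common
# industry practice.
agg = raw["wind_speed"].resample("10min").agg(["mean", "std", "min", "max"])

print(agg.head())
```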

