Abstract

This paper presents the results of an extensive testing campaign to validate the time series analysis approach for estimating the linear field performance loss rates (PLR) of grid-connected photovoltaic (PV) systems of different technologies operating side-by-side at the PV Technology test site of the University of Cyprus since June 2006. Fifteen-minute average measurements of the array power at the maximum power point, PA, were used to construct time series of the performance ratio, RP, of each array. The time series were analysed with regARIMA and classical seasonal decomposition (CSD) to extract the trend, and linear regression (LR) was then used to calculate the slope. To validate the results, all arrays were disassembled and every module was tested at Standard Test Conditions (STC) in a class A+A+A+ solar simulator to calculate the nominal array degradation rate. For PV arrays with no failures identified through electroluminescence (EL) imaging, the comparison showed good agreement between the time series analysis approach and the indoor testing approach. In contrast, for arrays with EL-identified failures, the nominal array degradation rate was higher than the field PLR. The differences between the two methods were shown to be due to cracked cells, hotspots and spectral response mismatch. Finally, the comparison showed that, among the time series analysis methods, regARIMA produced statistically significant PLR estimates with low uncertainty and the best agreement with the nominal array degradation rates.
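The abstract does not state the paper's exact normalization, but a performance ratio definition consistent with IEC 61724 is

$$R_P = \frac{P_A / P_0}{G_I / G_{STC}},$$

where $P_0$ is the array's nominal power at STC, $G_I$ the in-plane irradiance, and $G_{STC} = 1000\,\mathrm{W/m^2}$. The CSD-plus-LR step can be illustrated with a short sketch. This is not the authors' implementation: the function name `plr_from_csd`, the monthly aggregation of the RP series, and the use of statsmodels' `seasonal_decompose` are illustrative assumptions, and the paper's regARIMA variant and uncertainty estimates are not reproduced here.

```python
# Minimal sketch of the CSD + LR approach described above, assuming a
# pandas Series of monthly performance ratio (RP) values.
# Hypothetical helper, not the paper's code.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

def plr_from_csd(rp: pd.Series, period: int = 12) -> float:
    """Estimate the performance loss rate (%/year) from an RP series
    via classical seasonal decomposition and linear regression."""
    # Classical (moving-average) decomposition; the trend component is
    # NaN at both ends by construction, so drop those points.
    trend = seasonal_decompose(rp, model="additive", period=period).trend.dropna()
    # Ordinary least-squares fit of the trend against time in months.
    t = np.arange(len(trend))
    slope, intercept = np.polyfit(t, trend.to_numpy(), 1)
    # Express the monthly slope as a yearly percentage of the fitted
    # initial level.
    return 100.0 * slope * 12 / intercept

# Hypothetical usage with synthetic data: a seasonal RP series with a
# built-in loss of roughly -0.7 %/year.
months = np.arange(120)
rp = pd.Series(0.85 - 0.0005 * months
               + 0.02 * np.sin(2 * np.pi * months / 12),
               index=pd.date_range("2006-06-01", periods=120, freq="MS"))
print(f"Estimated PLR: {plr_from_csd(rp):+.2f} %/year")
```

Regressing on the decomposed trend rather than on the raw RP series removes the seasonal component before the slope is estimated, which is the rationale for the CSD step named in the abstract.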
