Abstract

Advances in computing performance and machine learning accuracy have pushed engineers and researchers toward increasingly complex mathematical models, and methods such as deep neural networks have become ubiquitous. However, the interpretability of machine learning predictions in decision processes has been identified as a pressing topic in several engineering fields, causing confusion in various communities. This paper discusses a methodological framework of hybrid interpretability tools for neural network prediction in an engineering application. These tools analyze a decision’s consequences under different circumstances and situations. The aim is to reconcile ML prediction accuracy with interpretability, as part of a global approach to making systems more flexible. In this study, the interpretability of neural network predictions is treated from two perspectives: (i) model-specific methods, such as partial derivatives, and (ii) model-agnostic methods, which can be applied to the predictions of any ML model. To visualize and explain the inputs’ impact on the prediction results, Partial Dependence Plots (PDP), Individual Conditional Expectation (ICE), and Accumulated Local Effects (ALE) are used and compared. The prediction of the electrical power (PE) output of a combined cycle power plant was chosen to demonstrate the feasibility of these methods under real operating conditions. The results show that, among ambient temperature (AT), atmospheric pressure (AP), vacuum (V), and relative humidity (RH), the most influential input parameter is AT. The visualization outputs identify both the direction (positive or negative) and the form (linear, nonlinear, random, stepwise) of the relationship between each input variable and the model’s output. The interpretation results are consistent with studies in the literature.
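To make the model-agnostic tools concrete, the sketch below computes ICE curves and their average, the PDP, by hand for the AT feature. The surrogate model, input ranges, and coefficients are illustrative assumptions for a combined-cycle-style setting (a linear stand-in, not the paper's trained neural network); the same loop works unchanged with any `model(X)` callable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the trained network: PE falls with AT and V
# (assumed coefficients for illustration only, not the paper's model).
def model(X):
    at, v, ap, rh = X[:, 0], X[:, 1], X[:, 2], X[:, 3]
    return 480.0 - 1.8 * at - 0.3 * v + 0.06 * (ap - 1013.0) - 0.05 * rh

# Synthetic operating data for the four ambient inputs (assumed ranges).
X = np.column_stack([
    rng.uniform(2, 36, 500),      # AT [deg C]
    rng.uniform(25, 80, 500),     # V  [cm Hg]
    rng.uniform(993, 1033, 500),  # AP [mbar]
    rng.uniform(25, 100, 500),    # RH [%]
])

def ice_curves(model, X, feature, grid):
    """ICE: one prediction curve per observation as `feature` is varied."""
    curves = np.empty((len(X), len(grid)))
    for j, val in enumerate(grid):
        Xj = X.copy()
        Xj[:, feature] = val          # force the feature to the grid value
        curves[:, j] = model(Xj)
    return curves

grid = np.linspace(2, 36, 20)         # grid over the AT range
ice = ice_curves(model, X, feature=0, grid=grid)
pdp = ice.mean(axis=0)                # the PDP is the average of ICE curves

# The PDP slope recovers the direction of AT's effect on PE.
slope = (pdp[-1] - pdp[0]) / (grid[-1] - grid[0])
print(round(slope, 2))                # negative: PE decreases as AT rises
```

Plotting each row of `ice` plus the `pdp` average reproduces the kind of visualization the paper compares; ALE differs in that it accumulates local finite differences over conditional data windows rather than averaging full curves.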
