Abstract

In information theory, entropy is a measure of the uncertainty associated with a random variable. To identify the important variables of a system subject to many random variables, a novel entropy-based importance measure for random variables is proposed in this paper. The method evaluates the effect of a random variable on the output responses by calculating the resulting change in entropy. This technique focuses on the influence of input uncertainty on the entropy of the output responses and can easily be extended to cases in which correlations among the random variables are taken into account. In addition, the effect of a random variable within any partial region of interest can also be evaluated by the proposed global sensitivity indicator. The mathematical properties of the proposed importance measure are investigated and proven in detail. Three simple numerical examples are employed to demonstrate the applicability of the new importance measure, and a probabilistic risk assessment model is then used to demonstrate its engineering application.
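The core idea described above — scoring a random variable by how much the output entropy drops when that variable is fixed — can be illustrated with a minimal Monte Carlo sketch. Everything here is an assumption for illustration: the toy model `Y = X1 + 0.5*X2`, the histogram entropy estimator, and the sample sizes are not taken from the paper, which develops its own indicator and proofs.

```python
import numpy as np

def entropy_hist(samples, bins=50):
    """Histogram estimate of differential entropy (in nats)."""
    density, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    p = density * widths          # probability mass per bin
    nz = p > 0
    # H ≈ -sum_i p_i * log(f_i), with f_i the estimated density in bin i
    return -np.sum(p[nz] * np.log(density[nz]))

def importance(model, n_vars, i, n_outer=30, n_inner=20000, seed=0):
    """Entropy-based importance of input i: H(Y) minus the average
    output entropy when X_i is fixed at sampled values (a sketch of
    the general fix-one-input idea, not the paper's exact indicator)."""
    rng = np.random.default_rng(seed)
    # Unconditional output entropy
    X = rng.standard_normal((n_inner * 5, n_vars))
    h_total = entropy_hist(model(X))
    # Conditional output entropies with X_i held fixed
    h_cond = []
    for _ in range(n_outer):
        X = rng.standard_normal((n_inner, n_vars))
        X[:, i] = rng.standard_normal()   # fix X_i at one sampled value
        h_cond.append(entropy_hist(model(X)))
    return h_total - np.mean(h_cond)

# Hypothetical toy response: X1 dominates, so fixing it should
# remove more output uncertainty than fixing X2.
model = lambda X: X[:, 0] + 0.5 * X[:, 1]
imp0 = importance(model, 2, 0)
imp1 = importance(model, 2, 1)
```

For independent standard normal inputs, fixing `X1` shrinks the output variance from 1.25 to 0.25, so its entropy-based importance should come out clearly larger than that of `X2`, matching the ranking the measure is designed to produce.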
