Abstract

An uncertain parameter in a linear system is represented as a random variable. Sensitivity of the system is measured in terms of the difference between its output when the parameter takes its nominal value and its output when the parameter takes a randomly chosen value. More specifically, the measure is the expected value of the integral squared difference. Given a system described by a linear time-invariant differential equation and an assumed parameter distribution, this measure can be evaluated directly, in closed form. This direct approach is not possible when the system includes a time delay; in that case the expectation can instead be evaluated by Monte Carlo simulation. As an alternative, a computational scheme using orthonormal functions is proposed. This method requires the approximate solution of an infinite set of differential equations by solving an associated finite set. Walsh functions are convenient when the parameter distribution is of the truncated variety, which is natural for an uncertain parameter, and they lead to loosely coupled equations which are amenable to the necessary truncation. The significant features of the method are displayed by a simplified example, without time delay, in order to compare this method with the exact evaluation and with a Monte Carlo evaluation.
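
As a concrete illustration of the sensitivity measure, the following sketch estimates E[integral of (y(t; a0) - y(t; a))^2 dt] by Monte Carlo for a first-order system dy/dt = -a*y, y(0) = 1, whose uncertain pole a is drawn from a truncated (uniform) distribution around its nominal value a0. The system, the distribution, and all numerical values are hypothetical choices made for illustration, not taken from the paper; for this simple case the inner integral also has the closed form 1/(2*a0) - 2/(a0 + a) + 1/(2*a), so the Monte Carlo estimate can be checked against a direct evaluation, as the abstract describes.

    import numpy as np
    from scipy.integrate import quad

    rng = np.random.default_rng(0)

    a0 = 1.0   # nominal parameter value (hypothetical)
    hw = 0.3   # half-width of the truncated uniform distribution (hypothetical)

    def ise(a, a_nom=a0):
        # Closed-form integral of (e^{-a_nom t} - e^{-a t})^2 over t in [0, inf),
        # i.e. the integral squared difference between nominal and perturbed outputs.
        return 0.5 / a_nom - 2.0 / (a_nom + a) + 0.5 / a

    # Monte Carlo estimate of the sensitivity measure E[ISE]:
    # sample the uncertain parameter, average the squared-difference integral.
    samples = rng.uniform(a0 - hw, a0 + hw, size=100_000)
    mc_estimate = ise(samples).mean()

    # Direct evaluation: average the closed-form ISE over the uniform density.
    exact, _ = quad(lambda a: ise(a) / (2 * hw), a0 - hw, a0 + hw)

    print(f"Monte Carlo estimate: {mc_estimate:.6f}")
    print(f"Direct evaluation:    {exact:.6f}")

With a time delay in the system, the closed-form ISE used above is no longer available, which is the situation that motivates the orthonormal-function scheme proposed in the paper.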
