Abstract

An interval observer has been shown to be a suitable passive robust strategy for generating an adaptive threshold for residual evaluation when model uncertainty is located in the parameters (interval model). In such an approach, the observer gain plays an important role: it determines the minimum detectable fault for a given fault type and can be tuned to enhance the observer's fault detection properties. The aim of this paper is to analyze the influence of the observer gain on the time evolution of the residual sensitivity to a fault. As a result of this sensitivity study, the time evolution of the minimum detectable fault for a given fault type, and hence the fault detection performance of the interval observer, can be determined. In particular, three types of faults are introduced according to the time evolution of their detectability: permanently (strongly) detected, non-permanently (weakly) detected, and non-detected. An example based on a mineral grinding-classification process illustrates the derived results.
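To make the idea concrete, below is a minimal sketch (not the paper's method) of an interval observer used as an adaptive-threshold generator for residual evaluation. It assumes a hypothetical scalar discrete-time plant whose parameter a is known only to lie in an interval; all numerical values, variable names, and the additive sensor fault are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's design) of an interval
# observer acting as an adaptive threshold generator for residual
# evaluation. Hypothetical scalar plant: x[k+1] = a*x[k] + b*u[k],
# y[k] = x[k], with the uncertain parameter a known only to lie in
# [a_lo, a_hi] (interval model). All numbers below are made up.
a_lo, a_hi = 0.78, 0.82   # interval bounds on the uncertain parameter
a_true = 0.80             # actual (unknown) plant parameter
b = 1.0
L = 0.5                   # observer gain: shapes the threshold adaptation

def interval_observer_step(xh_lo, xh_hi, u, y):
    """Propagate the lower/upper state estimates one step.

    With 0 <= L <= a_lo and a nonnegative state, the fault-free output
    is guaranteed to stay inside [xh_lo, xh_hi] (a cooperativity-style
    inclusion argument for this scalar case).
    """
    new_lo = a_lo * xh_lo + b * u + L * (y - xh_lo)
    new_hi = a_hi * xh_hi + b * u + L * (y - xh_hi)
    return new_lo, new_hi

x, xh_lo, xh_hi = 1.0, 0.8, 1.2   # true state inside the initial envelope
for k in range(60):
    u = 1.0
    f = 0.3 if k >= 30 else 0.0    # additive output (sensor) fault at k = 30
    y = x + f
    # Residual evaluation: the envelope plays the role of an adaptive
    # threshold; a fault is indicated when y leaves it.
    if not (xh_lo <= y <= xh_hi):
        print(f"k={k}: fault indicated, y={y:.3f} "
              f"outside [{xh_lo:.3f}, {xh_hi:.3f}]")
    xh_lo, xh_hi = interval_observer_step(xh_lo, xh_hi, u, y)
    x = a_true * x + b * u
```

In this toy setting the gain trade-off described in the abstract is visible directly: increasing L tightens the envelope (lowering the minimum detectable fault) but also speeds up the threshold's adaptation through the correction term L*(y - xh), so an abrupt fault tends to be flagged only transiently (weakly detected); decreasing L slows the adaptation but widens the envelope, and a fault smaller than the envelope width goes undetected altogether. This mirrors the three detectability classes introduced above.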
