Abstract

When model uncertainty is located in the parameters (interval model), an interval observer has been shown to be a suitable strategy for generating an adaptive threshold to be used in residual evaluation. In interval observer-based fault detection methods, the observer gain plays an important role since it determines the minimum detectable fault for a given type of fault and allows enhancing the observer's fault detection properties while mitigating computational drawbacks of the model (i.e. the wrapping effect and computational complexity). In this paper, the effect of the observer gain on the time evolution of the residual sensitivity to a fault is analyzed. Then, using these sensitivity studies, the time evolution of the minimum detectable fault is established. On this basis, three types of faults are introduced according to the time evolution of their detectability: permanently (strongly) detected, non-permanently (weakly) detected, and non-detected. Finally, an example based on a mineral grinding-classification process is used to illustrate the derived results.
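To make the residual-evaluation idea concrete, the sketch below (Python, not taken from the paper) propagates an interval observer for a hypothetical scalar system whose parameter a lies in an interval [a_lo, a_hi], and uses the predicted output envelope as the adaptive threshold: a fault is flagged whenever the measured output leaves that envelope. The system, the gain L, the noise bound v_bar and all numerical values are illustrative assumptions only.

```python
import numpy as np

def interval_observer_fd(y_meas, u, a_lo, a_hi, b, c, L, v_bar, x0=0.0):
    """Hypothetical scalar interval observer used as an adaptive threshold.

    Returns one boolean per sample: True when the measured output falls
    outside the predicted output envelope (residual exceeds the threshold).
    Assumes c >= 0 so the envelope bounds stay ordered.
    """
    x_lo, x_hi = x0, x0                      # interval state estimate [x_lo, x_hi]
    alarms = []
    for k in range(len(y_meas)):
        # Predicted output envelope, widened by the measurement-noise bound
        y_lo, y_hi = c * x_lo - v_bar, c * x_hi + v_bar
        alarms.append(not (y_lo <= y_meas[k] <= y_hi))

        # Luenberger-style correction evaluated at the envelope midpoint
        # (a simplification); the gain L trades envelope tightness against
        # conservatism such as the wrapping effect.
        innov = y_meas[k] - 0.5 * (y_lo + y_hi)

        # Interval-arithmetic propagation of the uncertain parameter a:
        # a*x ranges over the min/max of the corner products.
        corners = (a_lo * x_lo, a_lo * x_hi, a_hi * x_lo, a_hi * x_hi)
        x_lo = min(corners) + b * u[k] + L * innov
        x_hi = max(corners) + b * u[k] + L * innov
    return alarms

# Illustrative use on synthetic data with an additive output fault after k = 60
rng = np.random.default_rng(0)
u = np.ones(100)
x, y_meas = 0.0, []
for k in range(100):
    y_meas.append(x + 0.01 * rng.standard_normal() + (0.4 if k >= 60 else 0.0))
    x = 0.78 * x + 0.5 * u[k]                 # "true" plant with a = 0.78
alarms = interval_observer_fd(np.array(y_meas), u, a_lo=0.75, a_hi=0.8,
                              b=0.5, c=1.0, L=0.1, v_bar=0.05)
print("first alarm at sample:", alarms.index(True) if any(alarms) else None)
```

With these illustrative numbers the observer correction gradually adapts the envelope toward the faulty output, so the alarm need not persist; this is the kind of behaviour the paper classifies as non-permanently (weakly) detected, in contrast to permanently (strongly) detected faults.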
