Abstract

Condition-based maintenance (CBM) is a promising technique for a wide variety of deteriorating systems. Its effectiveness largely depends on the quality of condition monitoring. Most mathematical models of CBM assume perfect inspections, in which the system condition is determined without error. This article presents a mathematical model of CBM with imperfect condition monitoring conducted at discrete times. Mathematical expressions are derived for evaluating the probabilities of correct and incorrect decisions when monitoring the system condition at a scheduled time. These probabilities are then incorporated into the Shannon entropy equation, and the problem of determining the optimal preventive maintenance threshold at each inspection time by the criterion of minimum Shannon entropy is formulated. The article shows, for the first time, that the Shannon entropy is a convex function of the preventive maintenance threshold at each condition-monitoring time. It also shows that the probabilities of correct and incorrect decisions depend on time and on the parameters of the degradation model. Numerical calculations show that the proposed approach to determining the optimal preventive maintenance threshold can significantly reduce the uncertainty of decisions about the condition of the monitored object.
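
As an illustration only, and not the paper's model, the following Python sketch shows the kind of computation the abstract describes: for a candidate preventive maintenance threshold, estimate the probabilities of true-positive, false-negative, false-positive, and true-negative decisions under imperfect monitoring, compute the Shannon entropy of these outcomes, and pick the threshold minimizing the entropy over a grid. The Gaussian degradation and measurement-error distributions, the failure threshold ft, the Monte Carlo estimation, and all parameter values are illustrative assumptions introduced here.

import numpy as np

# Illustrative assumptions (not from the paper): the degradation level X at an
# inspection time is normally distributed, the measurement is Z = X + E with
# Gaussian error E, ft is an assumed failure threshold, and h is a candidate
# preventive-maintenance (PM) threshold.

def decision_probabilities(h, mu_x, sigma_x, sigma_e, ft, n=100_000, seed=0):
    """Monte Carlo estimates of the four decision-outcome probabilities."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu_x, sigma_x, n)        # true (unobserved) degradation
    z = x + rng.normal(0.0, sigma_e, n)     # imperfect measurement of x
    critical = x >= ft                      # true state: degradation is critical
    flagged = z >= h                        # decision: schedule preventive maintenance
    tp = np.mean(critical & flagged)        # true positive
    fn = np.mean(critical & ~flagged)       # false negative (missed)
    fp = np.mean(~critical & flagged)       # false positive (false alarm)
    tn = np.mean(~critical & ~flagged)      # true negative
    return np.array([tp, fn, fp, tn])

def shannon_entropy(p):
    """Shannon entropy (bits) of a discrete distribution, ignoring zero terms."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def optimal_pm_threshold(mu_x, sigma_x, sigma_e, ft):
    """Grid search for the PM threshold minimizing the decision entropy."""
    grid = np.linspace(mu_x - 3.0 * sigma_x, ft, 100)
    entropies = [shannon_entropy(
        decision_probabilities(h, mu_x, sigma_x, sigma_e, ft)) for h in grid]
    return grid[int(np.argmin(entropies))]

if __name__ == "__main__":
    h_opt = optimal_pm_threshold(mu_x=8.0, sigma_x=1.0, sigma_e=0.5, ft=10.0)
    print(f"Entropy-minimizing PM threshold (illustrative): {h_opt:.3f}")

In the article itself, the decision probabilities follow from the degradation model at each inspection time and the entropy is shown analytically to be convex in the threshold; the sketch above only illustrates the selection step numerically under the stated assumptions.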

Highlights

  • The concept of “entropy” is widely used in various fields of science

  • We should note that the proposed approach to decision making at condition monitoring can be applied to deteriorating processes described by the model (32) and to many other monotonic stochastic processes, such as the gamma process and the inverse Gaussian process (a gamma-process simulation is sketched after this list)

  • In condition-based maintenance (CBM) models, it is wrong to assume that the probabilities of false-positive, true-positive, false-negative, and true-negative decisions are constant
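
As an illustration of one such monotone degradation model (not taken from the paper), the sketch below simulates sample paths of a stationary gamma process, whose independent increments over an interval of length dt are gamma distributed with shape proportional to dt. The function name and all parameter values are assumptions for illustration.

import numpy as np

def simulate_gamma_process(shape_rate, scale, t_grid, n_paths=5, seed=0):
    """Simulate monotone degradation paths of a stationary gamma process.

    Increments over [t, t + dt] are Gamma(shape_rate * dt, scale) distributed,
    so every path is non-decreasing. Parameter values are illustrative.
    """
    rng = np.random.default_rng(seed)
    dt = np.diff(t_grid)
    increments = rng.gamma(shape_rate * dt, scale, size=(n_paths, dt.size))
    paths = np.concatenate(
        [np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)], axis=1)
    return paths  # shape: (n_paths, len(t_grid))

t = np.linspace(0.0, 10.0, 101)
paths = simulate_gamma_process(shape_rate=2.0, scale=0.5, t_grid=t)
print(paths[:, -1])  # degradation levels reached at the final time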

Introduction

The concept of “entropy” is widely used in various fields of science. Clausius introduced this concept in the early 1850s for highly specific thermodynamic purposes. He proved a theorem stating that, over any cyclic process, the heat received by the system, each increment divided by the absolute temperature at which it is received, sums to a non-positive quantity. Between 1872 and 1875, Boltzmann introduced the concept of the entropy of a thermodynamic system, defined as the product of Boltzmann’s constant and the natural logarithm of the number of different microscopic states corresponding to a given macroscopic state. In 1948, Shannon proposed using the concept of entropy in information theory [1]. The Shannon formula calculates the binary information entropy of independent random events with m possible states.
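
For a random event with m possible states occurring with probabilities p_1, ..., p_m, this entropy is

H = -\sum_{i=1}^{m} p_i \log_2 p_i .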
