Equipment Reliability Modeling and Remaining Useful Life Prediction Based on Bayesian Small-Sample Methods
New equipment testing, constrained by environment and cost, yields limited reliability data, hindering accurate estimation. This paper proposes a Bayesian framework for time-varying reliability estimation and remaining-life prediction of small-sample equipment. Using working cycles as the life index, it constructs a residual-strength model of fatigue damage accumulation and derives the time-varying reliability function. To address poor parameter estimation with small samples, Bayesian inference fuses prior information with field test data. Normal and inverse gamma distributions are chosen as conjugate priors to derive the posterior distributions of the initial-strength and working-load parameters, along with their maximum a posteriori estimates. The high-dimensional posteriors are sampled by the Gibbs algorithm within MCMC, with convergence verified. Empirical analysis shows the model outperforms traditional ones: integrating prior and experimental data makes the posterior distribution more concentrated than the prior, reducing parameter uncertainty.
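The conjugate update this abstract relies on (a normal likelihood with a Normal-Inverse-Gamma prior) has a closed form. A minimal sketch: the hyperparameter names (`mu0`, `kappa0`, `alpha0`, `beta0`) and their defaults are illustrative, not values taken from the paper.

```python
import numpy as np

def nig_posterior(x, mu0=0.0, kappa0=1.0, alpha0=2.0, beta0=1.0):
    """Closed-form Normal-Inverse-Gamma posterior update for i.i.d. normal data.

    Prior: mu | sigma^2 ~ N(mu0, sigma^2 / kappa0), sigma^2 ~ InvGamma(alpha0, beta0).
    Returns the updated hyperparameters (mu_n, kappa_n, alpha_n, beta_n).
    """
    x = np.asarray(x, dtype=float)
    n, xbar = x.size, x.mean()
    ss = ((x - xbar) ** 2).sum()                      # within-sample sum of squares
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n        # precision-weighted mean
    alpha_n = alpha0 + n / 2.0
    beta_n = beta0 + 0.5 * ss + kappa0 * n * (xbar - mu0) ** 2 / (2.0 * kappa_n)
    return mu_n, kappa_n, alpha_n, beta_n
```

Because the update is closed-form, the posterior concentration the abstract mentions is visible directly: `kappa_n` and `alpha_n` grow with every observation, tightening the posterior relative to the prior.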
- Research Article
14
- 10.18187/pjsor.v15i2.2174
- Jun 12, 2019
- Pakistan Journal of Statistics and Operation Research
In this article we present a Bayesian prediction of multiplicative seasonal autoregressive moving average (SARMA) processes using the Gibbs sampling algorithm. First, we estimate the unobserved errors using the nonlinear least squares (NLS) method to approximate the likelihood function. Second, we employ conjugate priors on the model parameters and initial values and assume the model errors are normally distributed to derive the conditional posterior and predictive distributions. In particular, we show that the conditional posterior distributions of the model parameters and the variance are multivariate normal and inverse gamma, respectively, and the conditional predictive distribution of the future observations is a multivariate normal. Finally, we use these closed-form conditional posterior and predictive distributions to apply the Gibbs sampling algorithm to approximate empirically the marginal posterior and predictive distributions, enabling us to easily carry out multiple-step ahead predictions. We evaluate our proposed Bayesian method using a simulation study and real-world time series datasets.
- Research Article
17
- 10.18187/pjsor.v13i3.1647
- Aug 31, 2017
- Pakistan Journal of Statistics and Operation Research
In this paper we use the Gibbs sampling algorithm to develop a Bayesian inference for multiplicative double seasonal moving average (DSMA) models. Assuming the model errors are normally distributed and using natural conjugate priors, we show that the conditional posterior distributions of the model parameters and variance are multivariate normal and inverse gamma, respectively, and then we apply Gibbs sampling to approximate empirically the marginal posterior distributions. The proposed Bayesian methodology is illustrated using a simulation study.
- Research Article
13
- 10.1007/s13571-019-00192-z
- Apr 1, 2019
- Sankhya B
In this paper we use the Gibbs sampling algorithm to present a Bayesian analysis of multiplicative double seasonal autoregressive (DSAR) models, considering both estimation and prediction problems. Assuming the model errors are normally distributed and using natural conjugate and g priors on the initial values and model parameters, we show that the conditional posterior distributions of the model parameters and variance are multivariate normal and inverse gamma, respectively, and the conditional predictive distribution of the future observations is a multivariate normal. Using these closed-form conditional posterior and predictive distributions, we apply Gibbs sampling to approximate empirically the marginal posterior and predictive distributions, enabling us to easily carry out multiple-step ahead predictions. The proposed Bayesian method is evaluated using a simulation study and a real-world time series dataset.
- Research Article
17
- 10.5351/csam.2015.22.6.557
- Nov 30, 2015
- Communications for Statistical Applications and Methods
In this paper we develop a Bayesian inference for a multiplicative double seasonal autoregressive (DSAR) model by implementing a fast, easy and accurate Gibbs sampling algorithm. We apply Gibbs sampling to approximate empirically the marginal posterior distributions after showing that the conditional posterior distributions of the model parameters and the variance are multivariate normal and inverse gamma, respectively. The proposed Bayesian methodology is illustrated using simulated examples and real-world time series data.
- Research Article
171
- 10.1016/j.engstruct.2015.08.005
- Aug 24, 2015
- Engineering Structures
Bayesian model updating of a coupled-slab system using field test data utilizing an enhanced Markov chain Monte Carlo simulation algorithm
- Book Chapter
- 10.1093/oso/9780198841296.003.0016
- May 23, 2019
This chapter introduces Markov Chain Monte Carlo (MCMC) with Gibbs sampling, revisiting the “Maple Syrup Problem” of Chapter 12, where the goal was to estimate the two parameters of a normal distribution, μ and σ. Chapter 12 used the normal-normal conjugate to derive the posterior distribution for the unknown parameter μ; the parameter σ was assumed to be known. This chapter uses MCMC with Gibbs sampling to estimate the joint posterior distribution of both μ and σ. Gibbs sampling is a special case of the Metropolis–Hastings algorithm. The chapter describes MCMC with Gibbs sampling step by step, which requires (1) computing the posterior distribution of a given parameter, conditional on the value of the other parameter, and (2) drawing a sample from the posterior distribution. In this chapter, Gibbs sampling makes use of the conjugate solutions to decompose the joint posterior distribution into full conditional distributions for each parameter.
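The two-step Gibbs recipe the chapter describes, for a normal model with unknown μ and σ², can be sketched as follows. The priors and hyperparameters here are illustrative stand-ins (a normal prior on μ and an inverse-gamma prior on σ²), not the chapter's Maple Syrup numbers.

```python
import numpy as np

def gibbs_normal(x, n_iter=5000, m0=0.0, s0sq=100.0, a0=2.0, b0=1.0, seed=0):
    """Gibbs sampler for (mu, sigma^2) of a normal model, with
    mu ~ N(m0, s0sq) and sigma^2 ~ InvGamma(a0, b0) priors."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n, xbar = x.size, x.mean()
    mu, sigma2 = xbar, x.var()                # crude starting values
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        # Step 1: mu | sigma2, x is normal (normal-normal conjugacy)
        prec = 1.0 / s0sq + n / sigma2
        mean = (m0 / s0sq + n * xbar / sigma2) / prec
        mu = rng.normal(mean, np.sqrt(1.0 / prec))
        # Step 2: sigma2 | mu, x is inverse gamma (drawn via 1/Gamma)
        a_n = a0 + n / 2.0
        b_n = b0 + 0.5 * ((x - mu) ** 2).sum()
        sigma2 = 1.0 / rng.gamma(a_n, 1.0 / b_n)
        draws[t] = mu, sigma2
    return draws
```

Each iteration draws one parameter from its full conditional given the current value of the other, exactly the two-step cycle described above; after burn-in, the pairs approximate the joint posterior of (μ, σ²).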
- Research Article
99
- 10.2136/sssaj2002.1740
- Nov 1, 2002
- Soil Science Society of America Journal
Model nonlinearity and parameter interdependence invalidate the use of a first-order approximation to obtain exact confidence intervals for parameters in soil hydrologic models. In this study, the posterior distribution of parameters in soil water retention and hydraulic conductivity functions is examined using observed water retention data and a laboratory transient multistep outflow experiment. Parameter uncertainties obtained with traditional first-order approximations and uniform grid sampling strategies were compared with those obtained using the Metropolis algorithm, a Markov Chain Monte Carlo (MCMC) sampler. A diagnostic measure, based on multiple sequences generated in parallel, was used to test whether convergence of the Metropolis sampler to the posterior distribution had been achieved. Most significantly, as the Metropolis algorithm can cope with rough response surfaces generated by the objective function used, it not only successfully infers the multivariate posterior probability distribution of the model parameters, but also provides valuable insights into parameter interdependence in the full parameter space.
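A random-walk Metropolis sampler, together with the multiple-sequence convergence diagnostic the study describes (the Gelman-Rubin potential scale reduction factor), can be sketched generically. The 1-D target below is a placeholder log-posterior, not the soil-hydraulic one.

```python
import numpy as np

def metropolis(log_post, x0, n_iter, step, rng):
    """Random-walk Metropolis sampler for a 1-D log posterior."""
    x, lp = x0, log_post(x0)
    chain = np.empty(n_iter)
    for t in range(n_iter):
        prop = x + rng.normal(0.0, step)          # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob min(1, ratio)
            x, lp = prop, lp_prop
        chain[t] = x                              # rejected moves repeat x
    return chain

def gelman_rubin(chains):
    """Potential scale reduction factor from m parallel chains (rows)."""
    m, n = chains.shape
    w = chains.var(axis=1, ddof=1).mean()         # within-chain variance
    b = n * chains.mean(axis=1).var(ddof=1)       # between-chain variance
    var_hat = (n - 1) / n * w + b / n             # pooled variance estimate
    return np.sqrt(var_hat / w)                   # ~1 at convergence
```

Running several chains from overdispersed starting points and checking that the factor falls near 1 is the parallel-sequence test the abstract refers to.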
- Conference Article
1
- 10.1063/5.0059103
- Jan 1, 2021
- AIP conference proceedings
Quantile regression models the relationship between a quantile of the response variable and one or more predictor variables. It has advantages that linear regression does not: it is robust to outliers and can model heteroscedastic data. Quantile regression parameters can be estimated with the Bayesian method, a data analysis approach derived from the principle of Bayesian inference, i.e., learning from data inductively via Bayes' theorem. Estimating regression parameters by Bayesian inference requires the posterior distribution of the parameters, which is proportional to the product of the prior distribution and the likelihood function. Because computing the posterior distribution analytically is difficult when many parameters are estimated, the Markov Chain Monte Carlo (MCMC) method is proposed. Using MCMC in Bayesian quantile regression has the advantages of drawing parameter samples from an otherwise intractable posterior distribution while being computationally efficient and easy to implement. Yu and Moyeed (2001) introduced Bayesian quantile regression using a likelihood in which the errors follow an Asymmetric Laplace Distribution (ALD), and showed that minimizing the quantile-regression objective is equivalent to maximizing this ALD likelihood. The method used to estimate the quantile regression parameters is Gibbs sampling from the ALD, exploiting its representation as a mixture of exponential and normal distributions, to sample the posterior distribution derived in this thesis. Gibbs sampling yields a sequence of sampled parameter values, which are averaged to obtain the estimated regression parameters. The case study in this thesis examines the effect of risk factors of motor vehicle insurance customers on the size of the claims they submit.
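The Yu-Moyeed equivalence described above, between the quantile check loss and the ALD log-likelihood, is easy to verify numerically. A sketch assuming a standard ALD with unit scale; the function names are illustrative.

```python
import numpy as np

def check_loss(u, tau):
    """Quantile (check) loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    u = np.asarray(u, dtype=float)
    return u * (tau - (u < 0))

def ald_neg_loglik(u, tau):
    """Negative log-density of a standard ALD: f(u) = tau*(1-tau)*exp(-rho_tau(u)).

    Differs from the check loss only by the constant -log(tau*(1-tau)),
    so minimizing summed check losses == maximizing the ALD likelihood."""
    return check_loss(u, tau) - np.log(tau * (1.0 - tau))
```

As a usage example, the tau = 0.5 check loss is half the absolute error, so minimizing it over a constant recovers the sample median even with a gross outlier in the data.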
- Research Article
4
- 10.1214/16-bjps329
- Feb 1, 2018
- Brazilian Journal of Probability and Statistics
In this paper, we analyze the effect on posterior parameter distributions of four possible alternative prior distributions, namely Normal-Inverse Gamma, Normal-Scaled Beta two, Student's $t$-Inverse Gamma and Student's $t$-Scaled Beta two. We show the effects of these prior distributions when there is apparent conflict between the sample information and the elicited hyperparameters. In particular, we show that there are no systematic differences among the posterior parameter distributions associated with these four priors, using data on piped water demand in a linear model with autoregressive errors. To test the hypothesis that this result is due to the moderate sample size and the relatively high level of expert uncertainty, we perform simulation exercises assuming smaller sample sizes and lower expert uncertainty. We obtain the same general pattern, although Student's $t$ models are slightly less affected by prior information when expert certainty is high, and Scaled Beta two models exhibit a higher level of posterior dispersion of the variance parameter.
- Research Article
152
- 10.1016/j.jhydrol.2009.07.051
- Jul 23, 2009
- Journal of Hydrology
Assessing parameter, precipitation, and predictive uncertainty in a distributed hydrological model using sequential data assimilation with the particle filter
- Conference Article
64
- 10.1109/icassp.2014.6854186
- May 1, 2014
This paper presents a Bayesian fusion technique for multi-band images. The observed images are related to the high spectral and high spatial resolution image to be recovered through physical degradations, e.g., spatial and spectral blurring and/or subsampling defined by the sensor characteristics. The fusion problem is formulated within a Bayesian estimation framework. An appropriate prior distribution related to the linear mixing model for hyperspectral images is introduced. To compute Bayesian estimators of the scene of interest from its posterior distribution, a Gibbs sampling algorithm is proposed to generate samples asymptotically distributed according to the target distribution. To efficiently sample from this high-dimensional distribution, a Hamiltonian Monte Carlo step is introduced in this Gibbs sampler. The efficiency of the proposed fusion method is evaluated with respect to several state-of-the-art fusion techniques.
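A bare-bones version of the Hamiltonian Monte Carlo step that the paper embeds in its Gibbs sampler can be sketched as leapfrog integration followed by a Metropolis correction. The target, step size, and trajectory length below are illustrative, not the paper's hyperspectral posterior.

```python
import numpy as np

def hmc_sample(log_post, grad, x0, n_iter=1000, eps=0.1, n_leap=20, seed=0):
    """Hamiltonian Monte Carlo with leapfrog integration.

    log_post/grad: log target density and its gradient; eps/n_leap: leapfrog
    step size and number of steps per trajectory."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    draws = np.empty((n_iter, x.size))
    for t in range(n_iter):
        p = rng.normal(size=x.size)             # resample momentum each iteration
        x_new, p_new = x.copy(), p.copy()
        p_new += 0.5 * eps * grad(x_new)        # initial half-step for momentum
        for _ in range(n_leap - 1):
            x_new += eps * p_new                # full position step
            p_new += eps * grad(x_new)          # full momentum step
        x_new += eps * p_new
        p_new += 0.5 * eps * grad(x_new)        # final half-step for momentum
        # Metropolis correction on the Hamiltonian (potential + kinetic energy)
        h_old = -log_post(x) + 0.5 * p @ p
        h_new = -log_post(x_new) + 0.5 * p_new @ p_new
        if np.log(rng.uniform()) < h_old - h_new:
            x = x_new
        draws[t] = x
    return draws
```

The gradient-guided trajectories are what make such a step effective inside a Gibbs sweep over a high-dimensional conditional, where a random-walk proposal would mix slowly.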
- Research Article
69
- 10.1109/tnnls.2020.2977132
- Jan 1, 2021
- IEEE Transactions on Neural Networks and Learning Systems
With the rapid development of sensor and information technology, now multisensor data relating to the system degradation process are readily available for condition monitoring and remaining useful life (RUL) prediction. The traditional data fusion and RUL prediction methods are either not flexible enough to capture the highly nonlinear relationship between the health condition and the multisensor data or have not fully utilized the past observations to capture the degradation trajectory. In this article, we propose a joint prognostic model (JPM), where Bayesian linear models are developed for multisensor data, and an artificial neural network is proposed to model the nonlinear relationship between the residual life, the model parameters of each sensor data, and the observation epoch. A Bayesian updating scheme is developed to calculate the posterior distributions of the model parameters of each sensor data, which are further used to estimate the posterior predictive distributions of the residual life. The effectiveness and advantages of the proposed JPM are demonstrated using the commercial modular aero-propulsion system simulation data set.
- Research Article
1
- 10.1080/03610926.2025.2505587
- May 27, 2025
- Communications in Statistics - Theory and Methods
Existing Bayesian estimation methods of time series with seasonal patterns are based on the normality assumption; however, real time series might violate this assumption. In this article, by assuming the scale-mixtures of normal (SMN) distribution for the model errors, we propose the Bayesian estimation of seasonal autoregressive (SAR) models via the Gibbs sampler and Metropolis-Hastings algorithms. This SMN class of distributions includes various symmetric heavy-tailed distributions as special cases, such as the Student’s t, slash, and contaminated normal distributions. In particular, we employ appropriate prior distributions for the SAR parameters, and accordingly we derive the full conditional posteriors of the SAR coefficients and scale parameter to be the multivariate normal and inverse gamma, respectively. In addition, we derive the conditional posteriors of the parameters related to the SMN distribution to be in closed forms. Using the derived closed-form conditional posterior distributions, we propose the Gibbs sampler with the Metropolis-Hastings algorithm to approximate empirically the marginal posterior distributions. We present an extensive simulation study and a real application, aiming to evaluate the accuracy of the proposed algorithm.
- Research Article
87
- 10.1186/1742-4682-3-42
- Dec 1, 2006
- Theoretical Biology and Medical Modelling
Background: Translating a known metabolic network into a dynamic model requires reasonable guesses of all enzyme parameters. In Bayesian parameter estimation, model parameters are described by a posterior probability distribution, which scores the potential parameter sets, showing how well each of them agrees with the data and with the prior assumptions made. Results: We compute posterior distributions of kinetic parameters within a Bayesian framework, based on integration of kinetic, thermodynamic, metabolic, and proteomic data. The structure of the metabolic system (i.e., stoichiometries and enzyme regulation) needs to be known, and the reactions are modelled by convenience kinetics with thermodynamically independent parameters. The parameter posterior is computed in two separate steps: a first posterior summarises the available data on enzyme kinetic parameters; an improved second posterior is obtained by integrating metabolic fluxes, concentrations, and enzyme concentrations for one or more steady states. The data can be heterogeneous, incomplete, and uncertain, and the posterior is approximated by a multivariate log-normal distribution. We apply the method to a model of the threonine synthesis pathway: the integration of metabolic data has little effect on the marginal posterior distributions of individual model parameters. Nevertheless, it leads to strong correlations between the parameters in the joint posterior distribution, which greatly improve the model predictions in the subsequent Monte Carlo simulations. Conclusion: We present a standardised method to translate metabolic networks into dynamic models. To determine the model parameters, evidence from various experimental data is combined and weighted using Bayesian parameter estimation. The resulting posterior parameter distribution describes a statistical ensemble of parameter sets; the parameter variances and correlations can account for missing knowledge, measurement uncertainties, or biological variability.
The posterior distribution can be used to sample model instances and to obtain probabilistic statements about the model's dynamic behaviour.
- Research Article
208
- 10.1186/1297-9686-26-2-91
- Jan 1, 1994
- Genetics, Selection, Evolution : GSE
Summary - Gibbs sampling is a Monte Carlo procedure for generating random samples from joint distributions through sampling from and updating conditional distributions. Inferences about unknown parameters are made by: 1) computing summary statistics directly from the samples; or 2) estimating the marginal density of an unknown, and then obtaining summary statistics from the density. All conditional distributions needed to implement Gibbs sampling in a univariate Gaussian mixed linear model are presented in scalar algebra, so no matrix inversion is needed in the computations. For location parameters, all conditional distributions are univariate normal, whereas those for variance components are scaled inverted chi-square distributions. The procedure was applied to solve a Gaussian animal model for litter size in the Gamito strain of Iberian pigs. Data were 1 213 records from 426 dams. The model had farrowing season (72 levels) and parity (4) as fixed effects; breeding values (597), permanent environmental effects (426) and residuals were random. In CASE I, variances were assumed known, with REML (restricted maximum likelihood) estimates used as true parameter values. Here, means and variances of the posterior distributions of all effects were obtained, by inversion, from the mixed model equations. These exact solutions were used to check the Monte Carlo estimates given by Gibbs sampling, using 120 000 samples. Linear regression slopes of true posterior means on Gibbs means were almost exactly 1 for fixed, additive genetic and permanent environmental effects. Regression slopes of true posterior variances on Gibbs variances were 1.00, 1.01 and 0.96, respectively. In CASE II, variances were treated as unknown, with a flat prior assigned to these. Posterior densities of selected location parameters, variance components, heritability and repeatability were estimated.
Marginal posterior distributions of dispersion parameters were skewed, save the residual variance; the means, modes and medians of these distributions differed from the REML estimates, as expected from theory. The conclusions are: 1) the Gibbs sampler converged to the true posterior distributions, as suggested by CASE I; 2) it provides a richer description of uncertainty about genetic