Articles published on Statistical process monitoring
442 Search results
- Research Article
- 10.1142/s0218539325500627
- Dec 23, 2025
- International Journal of Reliability, Quality and Safety Engineering
- Ali Salmasnia + 2 more
Three concepts (statistical process monitoring, inventory control, and maintenance) are integrated to achieve more realistic results for imperfect processes. Previous research has developed integrated models in two contexts: (1) single-product systems with multiple assignable causes, and (2) multi-product systems with a single assignable cause. This study aims to optimize an integrated model for an imperfect production process in a multi-product system, considering the occurrence of multiple assignable causes. The model also accounts for process stoppages due to system failures. Constraints are imposed to prevent inventory deficits for each product and idle time within each production cycle, ensuring that all products are manufactured at least once per cycle. An X-bar control chart is employed to monitor the process and detect out-of-control states. Given the complexity of the model, a particle swarm optimization algorithm is used to determine the decision variables, which include cycle time, sample size, sampling interval, and control limit coefficients. The model's applicability is demonstrated through an industrial case study. Additionally, comparative analyses are conducted, and a sensitivity analysis is performed to evaluate the effects of key parameters on the optimal results.
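The X-bar chart at the heart of this model can be sketched in a few lines. A minimal version follows; the subgroup data, in-control parameters, and control-limit coefficient are illustrative assumptions, not values from the study (where the coefficient is itself a decision variable).

```python
import statistics

def xbar_chart(samples, mu0, sigma0, L=3.0):
    """Flag subgroup means outside mu0 +/- L*sigma0/sqrt(n).

    samples: list of equal-size subgroups (lists of measurements)
    mu0, sigma0: in-control process mean and standard deviation
    L: control-limit coefficient (a decision variable in the paper's model)
    """
    n = len(samples[0])
    half_width = L * sigma0 / n ** 0.5
    lcl, ucl = mu0 - half_width, mu0 + half_width
    signals = []
    for i, subgroup in enumerate(samples):
        xbar = statistics.fmean(subgroup)
        if not (lcl <= xbar <= ucl):
            signals.append(i)  # possible assignable cause at subgroup i
    return signals, (lcl, ucl)

# In-control subgroups around mu0 = 10, plus one mean-shifted subgroup
subgroups = [[10.1, 9.9, 10.0, 10.2],
             [9.8, 10.1, 10.0, 9.9],
             [11.5, 11.4, 11.6, 11.3]]
signals, limits = xbar_chart(subgroups, mu0=10.0, sigma0=0.2)
print(signals)  # the shifted third subgroup (index 2) is flagged
```

In the integrated model, the sample size, sampling interval, and L would be tuned jointly with the inventory and maintenance variables rather than fixed as here.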
- Research Article
- 10.1007/s40194-025-02280-3
- Dec 22, 2025
- Welding in the World
- Giulio Mattera + 3 more
Abstract Vision-based monitoring of Wire Arc Additive Manufacturing (WAAM) using supervised deep learning represents the state of the art in anomaly detection, but such approaches require large labeled datasets that are costly to obtain and typically limited to laboratory conditions. To address these limitations, this work proposes a hybrid deep learning–statistical process monitoring (SPM) framework tailored to the stochastic nature of conventional arc welding processes such as GMAW-based additive manufacturing, where existing methods often overfit. The framework integrates a residual convolutional autoencoder (Res-CAE) with skip connections, which jointly analyzes video frames to generate refined latent-space features that are subsequently monitored using a Bayesian-optimized T² control chart. A novel quality index is further introduced to track component quality by accounting for the number, frequency, and severity of anomalies, moving beyond the conventional role of process interruption. Compared with state-of-the-art reconstruction-based anomaly detection methods, the proposed approach improves the F₂-score from 73.4% to 82.7%, achieves superior latent-space representation, and provides an interpretable quality metric with quantified uncertainty, offering a comprehensive solution for image-based monitoring of arc welding processes. This improvement is particularly relevant in short-circuit operating modes, where the strong process variability leads reconstruction-based deep models to overgeneralise and unintentionally reconstruct anomalous patterns, whereas the proposed SPM-guided representation preserves meaningful deviations and yields a more discriminative latent space.
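The T² statistic such a chart monitors reduces to a quadratic form on the latent feature vectors. A minimal sketch follows; the mean vector and inverse covariance are placeholders, and the paper's autoencoder features and Bayesian-optimized limit are not reproduced.

```python
def t2_statistics(Z, mean, cov_inv):
    """Hotelling T^2 for each feature vector z: (z - m)^T S^{-1} (z - m)."""
    stats = []
    for z in Z:
        d = [zi - mi for zi, mi in zip(z, mean)]
        # quadratic form d^T * cov_inv * d
        t2 = sum(d[i] * sum(cov_inv[i][j] * d[j] for j in range(len(d)))
                 for i in range(len(d)))
        stats.append(t2)
    return stats

# Identity covariance for illustration; real charts estimate S from Phase I data.
Z = [[0.1, -0.2], [3.0, 3.0]]
t2 = t2_statistics(Z, mean=[0.0, 0.0], cov_inv=[[1.0, 0.0], [0.0, 1.0]])
print(t2)  # the second vector is far from the in-control mean
```

A point signals when its T² exceeds the chart's control limit, which in the paper is tuned by Bayesian optimization rather than taken from the usual chi-squared approximation.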
- Research Article
- 10.1371/journal.pone.0337707
- Dec 12, 2025
- PLOS One
- Zahra Jalilibal + 2 more
Monitoring process variability utilizing profile data remains a significant challenge in statistical process monitoring (SPM), especially in the context of multichannel profiles. Detecting shifts in the covariance matrix of a multivariate normal process is crucial for this purpose. The complexity increases notably in high-dimensional processes because of the large number of variables and limited sample sizes. Typically, monitoring changes in the covariance matrix assumes that only a few elements deviate simultaneously from their in-control values. This study introduces a new approach for monitoring the covariance matrix in Phase II for multichannel data. The suggested approach incorporates an exponentially weighted moving average (EWMA) control chart with multichannel functional principal components analysis (MFPCA) to derive the proposed statistics. Simulation results demonstrate the effectiveness of the suggested approach, highlighting its superior average run length performance under various shift patterns.
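The EWMA recursion that smooths the MFPCA-derived statistics is standard. A minimal sketch, with an illustrative smoothing constant λ = 0.2:

```python
def ewma(stats, lam=0.2, z0=0.0):
    """EWMA recursion z_t = lam * x_t + (1 - lam) * z_{t-1}.

    stats: sequence of charting statistics (here, derived from MFPCA scores)
    lam: smoothing constant in (0, 1]; smaller values favor small-shift detection
    """
    z, out = z0, []
    for x in stats:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

# A step change at t = 2 is accumulated gradually by the EWMA
series = [0.0, 0.0, 1.0, 1.0, 1.0]
smoothed = ewma(series)
print(smoothed)  # rises toward 1.0: 0.0, 0.0, 0.2, 0.36, 0.488
```

The paper's contribution lies in which statistics are fed into this recursion and how the limits are set for covariance shifts, not in the recursion itself.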
- Research Article
- 10.7717/peerj-cs.3426
- Dec 9, 2025
- PeerJ Computer Science
- Shaohua Zhao + 3 more
Centrifugal compressors are widely used in the oil and natural gas industry for gas compression, reinjection, and transportation. To address the challenges of separating data anomalies from equipment failures and the limited knowledge acquisition from expert knowledge bases, this article proposes a dynamic fault diagnosis method for centrifugal compressor expert systems, combining convolutional neural networks (CNN) and principal component analysis for statistical process monitoring (PCA-SPE). The method combines expert knowledge with instrument data, addressing weak links in existing petrochemical instrument safety monitoring technology and traditional expert systems. It has been validated using process data from centrifugal compressors. The results demonstrate that the method achieved 100% classification accuracy for two types of faults: non-starting of the drive machine and excessively low oil pressure. Combined with the expert system, it reached satisfactory diagnostic performance.
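The SPE (squared prediction error) statistic in PCA-based monitoring measures how much of an observation falls outside the retained principal subspace. A minimal sketch with a single, illustrative orthonormal loading vector (real models estimate loadings from training data):

```python
def spe(x, loadings):
    """Squared prediction error ||x - P^T P x||^2.

    x: one observation vector
    loadings: rows are orthonormal retained principal component directions
    """
    # scores: projection of x onto each retained component
    t = [sum(p_j * x_j for p_j, x_j in zip(p, x)) for p in loadings]
    # reconstruction from the retained subspace
    x_hat = [sum(loadings[k][j] * t[k] for k in range(len(loadings)))
             for j in range(len(x))]
    return sum((xj - xhj) ** 2 for xj, xhj in zip(x, x_hat))

# One retained component along the first axis, for illustration
P = [[1.0, 0.0, 0.0]]
print(spe([2.0, 0.0, 0.0], P))  # 0.0: fully explained by the subspace
print(spe([2.0, 3.0, 4.0], P))  # 25.0: residual variation, a potential fault
```

An SPE value above a control limit indicates variation the PCA model cannot explain, which the article's expert system then maps to candidate fault causes.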
- Research Article
- 10.1080/03610926.2025.2594055
- Nov 22, 2025
- Communications in Statistics - Theory and Methods
- Marzieh Eshraghi + 2 more
In the past two decades, various new methods have emerged in the scope of statistical process monitoring, among which synthetic-type control charts play a significant role. Quality characteristics are universally defined by the relationship between a response variable and one or more independent variables, known as a profile, which requires continuous monitoring over time. However, there has been limited research on utilizing synthetic control charts for profile monitoring, despite the widespread use of simple and multiple linear profiles in practice. This research focuses on designing synthetic control charts to monitor various parameters, including the intercept, slopes, and standard deviation of residuals, for Phase II monitoring of simple and multiple linear profiles. We introduce two new control charts: the synthetic-modified T2 control chart and the synthetic MEWMA control chart. We provide an algorithm for developing and optimally designing the proposed synthetic charts. Their performance is extensively evaluated through Monte Carlo simulations under different shifts in zero-state and steady-state scenarios. The results demonstrate that the synthetic control charts often outperform their competitors. Also, a comparative analysis is provided to assess the two proposed synthetic control charts. The application of the proposed control charts is also illustrated in the leather industry.
- Research Article
- 10.1080/00207543.2025.2584720
- Nov 12, 2025
- International Journal of Production Research
- Suying Zhang + 3 more
Due to the stochastic nature of textured surfaces, in-situ quality monitoring for texture-related defects using statistical process monitoring (SPM) is important yet challenging in academic research and industrial applications. This article presents an in-situ EWMA monitoring scheme based on the likelihood ratio test to quantify and detect unexpected global shifts in textured surfaces. We employ Gradient Boosting Regression Trees to implicitly characterise the joint distribution of textile image pixels and to model the texture. Even with a limited number of Phase I samples, the proposed scheme, equipped with a data-driven control limit algorithm, estimates the distribution of the charting statistics via kernel density estimation (KDE), continuously updates the probability limits at each time point during the monitoring phase, and implements dynamic monitoring to identify defective surfaces with satisfactory in-control performance. The simulated stochastic experiments confirm the advantage of the proposed method. Also, a real layerwise image monitoring case based on Fused Deposition Modeling from Additive Manufacturing (AM) is provided.
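A data-driven control limit of the kind described, i.e., an upper quantile of a KDE fitted to Phase I charting statistics, can be sketched as follows. The Gaussian kernel, Silverman bandwidth rule, and α = 0.05 are illustrative choices, not the paper's algorithm.

```python
import math
import random

def kde_control_limit(phase1_stats, alpha=0.05):
    """(1 - alpha) quantile of a Gaussian KDE fitted to Phase I statistics."""
    n = len(phase1_stats)
    mean = sum(phase1_stats) / n
    sd = (sum((x - mean) ** 2 for x in phase1_stats) / (n - 1)) ** 0.5
    h = 1.06 * sd * n ** (-1 / 5)  # Silverman's rule-of-thumb bandwidth

    def cdf(x):
        # KDE cdf: average of Normal cdfs centred at each Phase I statistic
        return sum(0.5 * (1 + math.erf((x - xi) / (h * math.sqrt(2))))
                   for xi in phase1_stats) / n

    lo, hi = min(phase1_stats) - 4 * h, max(phase1_stats) + 4 * h
    for _ in range(60):  # bisection for the (1 - alpha) quantile
        mid = (lo + hi) / 2
        if cdf(mid) < 1 - alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

random.seed(0)
stats = [random.gauss(0, 1) for _ in range(500)]
ucl = kde_control_limit(stats, alpha=0.05)
print(round(ucl, 2))  # should land near the N(0,1) 95th percentile (~1.64)
```

In the paper's dynamic scheme this limit would be recomputed as new in-control statistics accumulate, rather than fixed once from Phase I.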
- Research Article
- 10.3390/pr13113538
- Nov 4, 2025
- Processes
- Jingzhi Rao + 3 more
With the increasing demands for process safety and manufacturing efficiency, process monitoring has garnered significant attention from both academia and industry over the past few decades. Process monitoring aims to detect deviations from normal operating conditions by analyzing data features extracted under predefined normal states. However, the inherent non-stationarity of real industrial processes can compromise the accurate definition of these normal conditions, thereby limiting the effectiveness of traditional multivariate statistical process monitoring (MSPM) methods. A common strategy to address non-stationarity is to employ projection matrices that transform non-stationary time series into stationary ones, upon which monitoring statistics are constructed. Nevertheless, this approach often overlooks the valuable information contained in the non-stationary subspace, leading to insufficient extraction of fault-relevant features. Fault signatures may manifest in both stationary and non-stationary components of the process data. To overcome these limitations, an integrated monitoring framework that combines Stationary Subspace Analysis (SSA), a Stacked Autoencoder (SAE), and Support Vector Data Description (SVDD) is proposed in this research. Specifically, SSA was first applied to decompose the process data into stationary and non-stationary subspaces. Monitoring statistics were then constructed directly in the stationary subspace, while reconstruction errors from the SAE were used to capture features in the non-stationary subspace. Finally, SVDD was used to fuse the dual-space statistical indicators, enabling comprehensive fault detection. The proposed method was validated by the Tennessee Eastman and real industrial processes. Comparative results demonstrate that it outperformed existing non-stationary monitoring techniques in terms of monitoring performance.
- Research Article
- 10.1080/00401706.2025.2561744
- Oct 21, 2025
- Technometrics
- Christian Capezza + 2 more
Statistical process monitoring (SPM) methods are essential tools in quality management to assess the stability of industrial processes, that is, to dynamically classify the process state as in control, under normal operating conditions, or out of control, otherwise. Although traditional SPM methods are based on unsupervised approaches, supervised methods leverage process data with labels revealing the true process state. However, labeling procedures are often expensive, making it impractical to annotate all data points. To address this challenge, we propose a novel stream-based active learning strategy for SPM that selects the most informative data points for labeling under a limited budget. While traditional active learning methods assume independently distributed data, we explicitly account for temporal dependencies in data streams, leveraging partially hidden Markov models to integrate labeled and unlabeled observations. The proposed method balances exploration, to detect unseen process states, and exploitation, to refine classification accuracy within known states, and extends pool-based active learning to real-time settings by providing labeling decisions for each incoming data point. The proposed method’s performance in classifying the process state is assessed through a simulation and a case study on monitoring a resistance spot welding process in the automotive industry, which motivated this research.
- Research Article
- 10.3390/jpbi2040017
- Oct 16, 2025
- Journal of Pharmaceutical and BioTech Industry
- Sushrut Marathe + 7 more
The design and development of a robust and consistent manufacturing process for monoclonal antibodies (mAbs), augmented by advanced process analytics capabilities, is a key current focus area in the pharmaceutical industry. In this work, we describe the development and operationalization of multivariate statistical process monitoring (MSPM), a data-driven modelling approach, to monitor biopharmaceutical manufacturing processes. This approach helps in understanding the correlations between the various variables and is used for the detection of deviations and anomalies that may indicate abnormalities or changes in the process compared to the historical dataspace. Therefore, MSPM enables early fault detection with scope for preventative intervention and corrective actions. We additionally cover the value of in silico data in the development of MSPM models, principal component analysis (PCA), and batch modelling methods, as well as refining and validating the models in real time.
- Research Article
- 10.1080/02664763.2025.2554805
- Sep 9, 2025
- Journal of Applied Statistics
- Hong-Ji Yang + 1 more
This study introduces a semiparametric method for designing individuals control charts in Phase I statistical process monitoring, aimed at ensuring guaranteed in-control performance. Motivated by the limitations of parametric and nonparametric methods under small to moderate sample sizes, the proposed approach combines extreme value theory with the exceedance probability criterion to construct control limits that adjust for estimation variability. The methodology uses Pickands and moment estimators to model extreme tail behavior, offering a more reliable solution than purely data-driven nonparametric methods. Simulation results show that a sample size of approximately n = 1112 is sufficient to achieve a nominal coverage probability of P_n = 0.90. The method demonstrates strong and consistent performance across a variety of distributions, avoids the need for computationally intensive bootstrap procedures, and provides an R package for implementation to support reproducibility and real-world applications, such as semiconductor manufacturing.
- Research Article
- 10.1002/qre.70041
- Aug 7, 2025
- Quality and Reliability Engineering International
- Giulio Mattera + 2 more
ABSTRACT Arc welding is classified as a special process under ISO standards, making process monitoring a critical component of the welding and additive manufacturing (AM) certification procedure. Nowadays, the advancements in data analysis have led to the growing use of Machine Learning (ML) techniques for real‐time weld quality assessment. However, due to their simple design and minimal data requirements, traditional statistical process monitoring (SPM) methods, such as control charts, remain widely used for evaluating process quality and detecting anomalies. Despite their significance, traditional SPM techniques struggle when dealing with multivariate and high‐frequency data typical of Industry 4.0 contexts, making their application challenging and highlighting the need for new approaches to data analysis. Therefore, in this study, we propose an innovative hybrid deep learning–based SPM technique for in situ monitoring of the wire arc additive manufacturing (WAAM) process, with the aim of making SPM more effective in this setting. In particular, an experimental campaign was conducted using the Invar36 alloy, and an online anomaly detection application was developed using ML methods to improve the performance of SPM. Specifically, a frequency‐informed convolutional auto‐encoder (FICA) is used as a sensor fusion technique for welding current and welding voltage data. The obtained latent space across additional temporal dimensions—which fuse the high‐frequency information in a low dimensional space—is then analysed using an exponentially weighted moving average (EWMA) chart to detect anomalies during production. The results demonstrate that the proposed methodology improves anomaly detection performance compared to conventional SPM techniques, with the F2‐score improving from 71.1% to 81.3%.
- Research Article
- 10.1088/1361-6501/adefb3
- Jul 24, 2025
- Measurement Science and Technology
- Ying Xie + 2 more
Abstract Since most multivariate statistical process monitoring (MSPM) methods require the manipulated data to follow a single-peak distribution, traditional MSPM methods are not applicable when monitoring a multimodal industrial process generated by several different operating modes, each representing different process conditions or behaviors. To address this problem, this paper proposes a novel method for data standardization, namely the mutual k local neighborhood standardization (MKLNS) method. First, the Euclidean distance between each training sample point and the remaining sample points is calculated and subsequently sorted in ascending order. Based on this sorting result and the definition of mutual k-nearest neighbors (MkNN), the MkNN set of the training sample points is determined. Second, the local neighborhood standardization (LNS) equation is optimized in two cases depending on the number K of mutual nearest neighbors of the examined data. Third, the multimodal data are standardized to have unimodal characteristics and an approximate Gaussian distribution. Finally, a principal component analysis (PCA) process monitoring model is built based on these standardized data, and the validity of the method is verified by a numerical example and Tennessee Eastman process simulations. The validation results demonstrate that the MKLNS-PCA fault detection method is more effective than the traditional methods.
- Research Article
- 10.1002/qre.70005
- Jun 28, 2025
- Quality and Reliability Engineering International
- Zahra Jalilibal + 2 more
Abstract In recent years, there has been increasing interest in simultaneously monitoring process mean and variability within the realm of statistical process monitoring (SPM). Monitoring and diagnosing faults using profile data remains challenging, especially with multichannel profiles. By jointly monitoring the mean vector and covariance matrix in multivariate processes, practitioners can reduce the inflated false alarm rates that typically arise from using two separate control charts. This paper extends two control charts, namely, the max exponentially weighted moving average (EWMA) control chart and the sum of squares EWMA (SS‐EWMA) control chart, to introduce novel single generally weighted moving average control charts for the combined monitoring of process mean and variability in multichannel profiles in Phase II. This approach integrates the EWMA scheme with multichannel functional principal components analysis to develop charting statistics for monitoring both the mean vector and covariance matrix of multichannel profiles. The performance of the proposed control charts is demonstrated through simulation studies in terms of average run length (ARL) and the standard deviation of run length criteria, as well as an illustrative example. The findings demonstrate the proposed control charts' detection efficiency across various out‐of‐control (OC) scenarios.
- Research Article
- 10.1021/acsomega.5c01880
- Jun 6, 2025
- ACS Omega
- Chen Zhang + 3 more
Multivariate statistical process monitoring (MSPM) methods have been widely used in industrial processes. Due to equipment aging, load changes, and unknown disturbances, actual industrial processes tend to exhibit nonstationary characteristics. However, traditional MSPM methods, which rest on stationarity assumptions, are unable to extract nonstationary features and accurately identify faults. In this paper, a novel adaptive method is proposed for fault detection of industrial processes mixed with stationary and nonstationary variations. First, nonstationary variables are selected by the unit root test. Second, the stationary residuals of the chosen variables are derived by Johansen cointegration analysis and form a new combination matrix together with the original stationary variables. Afterward, a slow feature analysis (SFA) monitoring model is established to realize feature-level fusion, and the local information increment (LII) average is introduced as the monitoring statistic. Finally, considering that LII is dynamically updated and has the strongest correlation with the latest data, dynamic control limits are set based on the fuzzy membership function. For nonstationary processes, the proposed method demonstrates significant superiority in fault detection, with a higher fault detection rate and lower computational complexity. The effectiveness of the proposed method is validated on the Tennessee Eastman process and electric servo mechanisms.
- Research Article
- 10.1002/bit.29039
- Jun 2, 2025
- Biotechnology and bioengineering
- Nima Sammaknejad + 7 more
Real-time multivariate statistical process monitoring (RT-MSPM) is essential for monitoring the health of biopharmaceutical processes and detecting anomalies and faults early in the process. RT-MSPM methods are commonly used to monitor cell culture process operations in biologics drug substance manufacturing. Batch evolution models (BEMs) are among the most common RT-MSPM methods. As an alternative to BEMs, it is possible to develop multiple models to monitor different phases of a batch process. If certain statistical properties are satisfied, a multistage algorithm can be leveraged to detect steady-state operation of a batch and process the corresponding time series so as to leverage data from other product recipes to monitor a new product with no prior history. This is specifically useful in modern biopharmaceutical manufacturing facilities, which frequently switch from producing one medicine to another. In this article, a novel real-time deep learning framework to monitor the health of biopharmaceutical processes with no prior product-specific history is proposed. Autoencoders (AEs), in conjunction with a multistage real-time data processing algorithm, are leveraged to detect, prevent, and identify the root causes of potential anomalies and faults in cell culture manufacturing processes producing monoclonal antibodies with no prior history. A novel algorithm for real-time root cause identification of anomalies is developed to generate real-time contribution charts for AEs. The performance of the new fault detection and isolation strategy is compared with conventional methods. Given the nonlinear architecture of AEs in comparison to conventional linear methods, AEs consistently provide more robust and stronger evidence for anomalous patterns using a combination of information in residuals and latent space. The proposed framework is successfully tested within a scalable software product for real-time monitoring of manufacturing cell culture bioreactors.
- Research Article
- 10.1080/00224065.2025.2512164
- May 26, 2025
- Journal of Quality Technology
- Daniele Zago + 3 more
Modern applications of statistical process monitoring involve checking the stability of multivariate processes with mixed data types, such as a combination of continuous, ordinal, and categorical quality variables. Appropriate statistical modeling for such data is often challenging, especially when the observed data are serially correlated, which explains why there is only a limited existing discussion on sequential monitoring of processes with mixed data. This paper introduces a general methodology to solve the problem. The main idea behind our approach is to sequentially transform the original observed data into continuous data through innovative data pre-processing, achieved by encoding the ordinal and categorical variables into continuous numerical variables using dummy and score variables and data transformation and decorrelation. Numerical studies show that the proposed method is effective in monitoring mixed data, in comparison with some state-of-the-art existing methods. The new method is illustrated in a case study involving online monitoring of hotel customers’ behaviors. Computer codes in Julia for implementing the proposed methodology are provided in the supplemental material.
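The encoding step, turning ordinal and categorical variables into continuous ones via score and dummy variables, can be illustrated on a toy record. The field names below are hypothetical, and the paper's subsequent transformation and decorrelation steps are omitted.

```python
def encode_mixed(record, categories):
    """Encode one mixed-type record as a continuous vector.

    Continuous values pass through unchanged, ordinal levels become
    integer scores (preserving their order), and a categorical variable
    becomes one 0/1 dummy per category level.
    """
    out = [record["temp"]]              # continuous variable, as-is
    out.append(float(record["grade"]))  # ordinal score, e.g. 1 < 2 < 3
    out.extend(1.0 if record["colour"] == c else 0.0 for c in categories)
    return out

vec = encode_mixed({"temp": 21.5, "grade": 2, "colour": "red"},
                   categories=["red", "green", "blue"])
print(vec)  # [21.5, 2.0, 1.0, 0.0, 0.0]
```

After this encoding, standard multivariate monitoring tools can be applied to the resulting numerical vectors, which is the premise the paper builds on before handling serial correlation.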
- Research Article
- 10.9734/ajpas/2025/v27i3733
- Mar 15, 2025
- Asian Journal of Probability and Statistics
- Muhammad Jibrin Zainab + 3 more
The Modified Exponentially Weighted Moving Average (M_EWMA) control chart is a novel statistical tool designed to enhance process monitoring, particularly in scenarios involving F-distributed data. This research investigates its effectiveness by analyzing control limits, process variability, and shift detection capabilities. Results indicate that the M_EWMA chart offers stable control limits and low variance under normal conditions, ensuring a reliable framework for maintaining process stability. The study evaluates the impact of the smoothing parameter λ on the chart's sensitivity. Smaller λ values result in tighter control limits, enabling the detection of minor shifts, while larger λ values allow for greater tolerance, reducing false alarms. The chart's adaptability to varying process monitoring requirements highlights its versatility across industrial applications. A comparative analysis with the method proposed by Saghir et al. (2021) demonstrates the superior performance of the M_EWMA chart. Unlike traditional approaches, the M_EWMA chart achieves immediate detection of large shifts, maintaining consistent Average Run Length (ARL₁) values irrespective of shift magnitude or λ. This finding underscores its robustness and efficiency in rapid shift detection. In conclusion, the M_EWMA control chart represents a significant advancement in statistical process monitoring. Its ability to balance sensitivity and robustness makes it an indispensable tool for modern quality control practices, offering a reliable and effective solution for detecting process deviations and maintaining operational excellence.
- Research Article
- 10.1080/02664763.2025.2455625
- Feb 14, 2025
- Journal of Applied Statistics
- Adel Ahmadi Nadi + 3 more
Online monitoring of the ratio of two random characteristics, rather than monitoring their individual behaviors, has many applications. For this aim, various control charts, known as RZ charts in the literature (e.g., Shewhart, memory-type, and adaptive monitoring schemes), have been designed to detect the ratio’s abnormal patterns as soon as possible. Most of the existing RZ charts rely on two assumptions about the process: (i) both individual characteristics are normally distributed, and (ii) the direction (upward or downward) of the RZ’s deviation from its in-control (IC) state to an out-of-control (OC) condition is known. However, these assumptions can be violated in many practical situations. In recent years, applying machine learning (ML) models in the Statistical Process Monitoring (SPM) area has provided several contributions compared to traditional statistical methods. However, ML-based control charts have not yet been discussed in the RZ monitoring literature. To this end, this study introduces a novel clustering-based control chart for monitoring RZ in Phase II. This method avoids making any assumptions about the direction of RZ’s deviation and does not need to assume a specific distribution for the two random characteristics. Furthermore, it can estimate the Change Point (CP) in the process.
- Research Article
- 10.1109/tii.2024.3461768
- Feb 1, 2025
- IEEE Transactions on Industrial Informatics
- Chuan Ma + 1 more
General Semi-Nonnegative Matrix Factorization and Its Application for Statistical Process Monitoring
- Research Article
- 10.1002/cem.70004
- Feb 1, 2025
- Journal of Chemometrics
- Taylor R Grimm + 2 more
ABSTRACT Multivariate statistical process monitoring is commonly used to detect abnormal process behavior in real time. Multiple process variables are monitored simultaneously, and alarms are issued when monitoring statistics exceed a predetermined threshold. Traditional approaches use a parametric threshold based on the assumptions of independence and multivariate normality of the process data, which are often violated in complex processes with high sampling frequencies, leading to excessive false alarms. Some approaches for improved threshold selection have been proposed, but they assume independence of the monitoring statistics, which are often autocorrelated. In this paper, we compare the performance of nonparametric estimators for computing thresholds from autocorrelated monitoring statistics through simulation. The false alarm rate and in‐control average run length of each estimator under different distributions, sample sizes, and autocorrelation levels and types are found. Estimator performance is found to depend on sample size and the strength of autocorrelation. The class of kernel density estimation (KDE) methods tends to perform better than estimators that use bootstrapping, and the proposed adjusted KDE methods that account for autocorrelation are recommended for general use. A case study to monitor a wastewater treatment facility further illustrates the performance of nonparametric and parametric thresholds when applied to real‐world systems.
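The two performance criteria compared in the study relate simply when the statistics are independent: a false alarm rate α implies an in-control average run length of 1/α. A toy sketch of both quantities, with made-up statistics:

```python
def false_alarm_rate(stats, threshold):
    """Fraction of in-control monitoring statistics exceeding the threshold."""
    return sum(s > threshold for s in stats) / len(stats)

def in_control_arl(far):
    """Average run length to a false alarm, assuming independent statistics.
    (Under autocorrelation, the paper's focus, this identity breaks down.)"""
    return float("inf") if far == 0 else 1.0 / far

far = false_alarm_rate([0.1, 0.2, 5.0, 0.3], threshold=1.0)
print(far, in_control_arl(far))  # 0.25 4.0
```

A threshold tuned for, say, α = 0.005 would target ARL0 = 200 under independence; the paper's point is that autocorrelated statistics distort this relationship, motivating the adjusted KDE estimators.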