Abstract

Batch processing is widely used in the manufacturing of high-value products. Traditional methods for quality assessment in batch processes often incur productivity and yield losses because quality variables are measured offline. Soft sensors can enhance product quality and increase production efficiency. However, batch-to-batch variation in processing times produces uneven batch data, which poses a significant challenge for building effective soft sensor models. Moreover, sensor failures, exacerbated by the harsh manufacturing environment, complicate accurate modeling of process variables, and existing soft sensor approaches address such malfunctions inadequately, resulting in significant prediction inaccuracies. This study proposes a fault-tolerant soft sensor algorithm that integrates two Long Short-Term Memory (LSTM) networks. The algorithm models process variables and compensates for sensor failures using historical batch quality data. It introduces a novel method for converting quality variables into process rates to align uneven batch data. A case study on simulated penicillin production validates the superiority of the proposed algorithm over conventional methods, demonstrating precise endpoint detection and effectiveness in addressing the challenges of batch process quality assurance. The study offers a robust solution to the problems of soft sensor reliability and data variability in industrial manufacturing.
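The abstract mentions converting quality variables into process rates so that batches of unequal duration can be aligned. The paper's exact transformation is not given here; the sketch below is a minimal, hypothetical illustration of the general idea, assuming the quality variable is a cumulative measurement sampled at (possibly irregular) time points, so that a rate can be formed from finite differences. The function name `quality_to_rates` and the sample data are assumptions, not taken from the paper.

```python
def quality_to_rates(times, quality):
    """Convert a cumulative quality trajectory into process rates.

    Hypothetical sketch: each rate is a finite difference dq/dt between
    consecutive samples. Two batches with different durations and sample
    counts become comparable on a rate basis rather than a clock basis.
    """
    if len(times) != len(quality) or len(times) < 2:
        raise ValueError("need matching time/quality series with >= 2 samples")
    rates = []
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        if dt <= 0:
            raise ValueError("time stamps must be strictly increasing")
        rates.append((quality[i] - quality[i - 1]) / dt)
    return rates


# Illustrative batches of unequal length (synthetic numbers):
batch_a = quality_to_rates([0.0, 1.0, 2.0, 4.0], [0.0, 2.0, 4.0, 8.0])
batch_b = quality_to_rates([0.0, 2.0, 5.0], [0.0, 4.0, 10.0])
```

Here `batch_a` and `batch_b` have different sample counts, but both are expressed in the same units (quality per unit time), which is the property the abstract's alignment step relies on.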
