Abstract

Monitoring of complex industrial processes can be achieved by acquiring process data through various sensing modalities. The recent emergence of deep learning provides a new route for processing multi-sensor information. However, the learning ability of shallow neural networks is insufficient, while the amount of data required by deep networks is often too large for industrial scenarios. This paper proposes a novel deep transfer learning method as a possible solution that retains the stronger learning ability of deep networks without requiring a large amount of training data. We show how a Transformer with self-attention, pretrained on natural language, can be transferred to the sensor fusion task. The proposed method is tested on three datasets: a hydraulic-system condition-monitoring dataset, a bearing dataset, and a gearbox dataset. The results show that the Transformer pretrained on natural language can effectively reduce the amount of data required to apply deep learning to industrial sensor fusion while maintaining high prediction accuracy. Labour-intensive and uncertain manual feature engineering can also be eliminated, as the deep networks extract features automatically. In addition, the self-attention mechanism of the Transformer aids the identification of critical sensors, thereby improving the interpretability of deep learning in industrial sensor fusion.
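The abstract notes that self-attention can surface which sensors matter most. As an illustration only (not the paper's actual architecture), the sketch below computes scaled dot-product self-attention over per-sensor feature vectors and derives a simple importance score per sensor from the attention each one receives; all shapes and the scoring rule are assumptions for demonstration.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over sensor channels.

    X: (num_sensors, d) matrix, one feature row per sensor channel.
    Returns the attended features and the attention matrix A,
    where A[i, j] is how much sensor i attends to sensor j.
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)
    # Row-wise softmax (numerically stabilised).
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    A = e / e.sum(axis=1, keepdims=True)
    return A @ X, A

# Toy example: 6 hypothetical sensor channels, 8 features each.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))
out, A = self_attention(X)

# One simple (assumed) importance heuristic: average attention
# each sensor receives across all query sensors.
importance = A.mean(axis=0)
print(importance.round(3))
```

Sensors with higher scores draw more attention from the rest of the network, which is one way an attention map can be read as a sensor-ranking signal.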
