Abstract

The Digital Twin (DT) integrates the cyber and physical spaces and has recently become a popular concept in smart manufacturing and Industry 4.0. The related literature provides a characterisation of the DT and identifies the problem of updating DT models throughout the product life cycle as one of the remaining knowledge gaps. The DT must keep its models up to date by analysing, in real time, the variable data of the physical asset, whose behaviour changes constantly over time. This automatic update process raises a data quality problem: ensuring that the captured values do not stem from measurement errors or provoked faults. In this work, a novel methodology is proposed to ensure data quality in the interconnection between the digital and physical spaces. The methodology is applied to a real case study, the DT of a solar cooling plant, where it acts as a learning decision support system that guarantees the quality of the data used to update the DT. The implementation integrates a neurofuzzy system to detect failures and a recurrent neural network to predict the size of the errors. Experiments carried out on historical plant data showed high detection and prediction accuracy, and the measured computation times demonstrate the feasibility of applying the methodology in practice.
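To make the two-stage pipeline described in the abstract concrete, below is a minimal sketch of how a fault detector followed by an error-size predictor could be wired together. This is an illustration, not the authors' implementation: the neurofuzzy detector is approximated by a simple fuzzy-threshold rule with hypothetical membership bounds, the recurrent network is a small LSTM, and all names, window sizes, and parameters are assumed for the example.

```python
# Sketch of a data-quality check before a DT update: a fuzzy detector flags
# a suspect sensor reading, then a recurrent network estimates the error size.
import torch
import torch.nn as nn

class ErrorSizePredictor(nn.Module):
    """Recurrent network mapping a window of sensor values to a
    predicted error magnitude for the latest measurement (illustrative)."""
    def __init__(self, n_sensors: int, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_sensors, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, window: torch.Tensor) -> torch.Tensor:
        # window: (batch, time, n_sensors)
        out, _ = self.lstm(window)
        # Use the last hidden state to predict one error magnitude per sample.
        return self.head(out[:, -1, :]).squeeze(-1)

def fuzzy_fault_degree(value: float, low: float, high: float) -> float:
    """Toy stand-in for the neurofuzzy detector: membership degree in the
    'faulty' set, 0 inside [low, high] and rising linearly outside it."""
    if low <= value <= high:
        return 0.0
    dist = min(abs(value - low), abs(value - high))
    return min(1.0, dist / (high - low))

# Usage: screen the newest reading, then estimate how large the error is
# before deciding whether the value may be used to update the DT.
model = ErrorSizePredictor(n_sensors=4)
window = torch.randn(1, 20, 4)  # 20 time steps of 4 sensors (dummy data)
if fuzzy_fault_degree(window[0, -1, 0].item(), low=-2.0, high=2.0) > 0.5:
    print("suspect reading; predicted error size:", model(window).item())
```

In the paper's setting the detector would be a trained neurofuzzy system rather than a fixed threshold rule, but the control flow, detect first, then quantify the error, is the same.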
