Deep digital twins (DDTs) are deep neural networks that encode the behavior of complex physical systems. DDTs are well suited as system representations because they can continuously adapt to operational changes and capture complex relationships between system components and processes that cannot be modeled explicitly. In this setting, DDTs benefit greatly from recent advances in geometric deep learning (GDL), which enable the integration of information from multiple systems based on schematic representations. A major challenge in training DDTs is their dependence on the quality and representativeness of the training data, especially under the dynamic conditions typical of prognostics and health management (PHM). Recent developments in differentiable simulation open new opportunities for optimizing the representativeness of training data. In this thesis, we propose a novel meta-learning framework that trains DDTs on the output of differentiable simulators. This setup enables active optimization of training-data sampling through gradient computation, improving training speed, robustness, and data representativeness. We extend the framework to address multi-system data integration in power grids and fault detection in railway traction networks. By applying our framework, we aim to tackle significant challenges in forecasting, anomaly detection, and sensor-fault analysis using advanced data-fusion techniques. Our approach promises substantial improvements in DDT robustness and operational efficiency, and its effectiveness will be demonstrated through empirical studies on both simple and complex case studies in the power systems domain.
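The idea of optimizing training-data sampling by backpropagating through a differentiable simulator can be illustrated with a minimal sketch. The code below is not the thesis implementation: the toy first-order simulator, the Gaussian sampling distribution, the small MLP surrogate, and all function names (`simulate`, `inner_update`, `outer_loss`) are illustrative assumptions. It only shows the general bilevel pattern: sample operating points from a learnable distribution, train the surrogate on simulated trajectories for a few steps, and differentiate a validation loss with respect to the sampling parameters.

```python
# Minimal sketch (assumed setup, not the proposed framework) of gradient-based
# optimization of training-data sampling through a differentiable simulator, in JAX.
import jax
import jax.numpy as jnp

def simulate(operating_point):
    """Differentiable toy simulator: step response of a first-order system."""
    tau = 1.0 + operating_point ** 2          # time constant depends on the operating point
    t = jnp.linspace(0.0, 5.0, 16)
    return 1.0 - jnp.exp(-t / tau)            # trajectory of 16 samples

def init_ddt(key, hidden=32):
    """Small MLP standing in for the deep digital twin surrogate."""
    k1, k2 = jax.random.split(key)
    return {
        "w1": jax.random.normal(k1, (1, hidden)) * 0.1, "b1": jnp.zeros(hidden),
        "w2": jax.random.normal(k2, (hidden, 16)) * 0.1, "b2": jnp.zeros(16),
    }

def ddt_forward(params, x):
    """Map a batch of operating points to predicted trajectories."""
    h = jnp.tanh(x[:, None] @ params["w1"] + params["b1"])
    return h @ params["w2"] + params["b2"]

def fit_loss(ddt_params, xs):
    """MSE between surrogate predictions and simulated trajectories."""
    ys = jax.vmap(simulate)(xs)
    return jnp.mean((ddt_forward(ddt_params, xs) - ys) ** 2)

def inner_update(ddt_params, xs, lr=1e-2, steps=5):
    """A few SGD steps on simulated data; kept differentiable w.r.t. xs."""
    for _ in range(steps):
        grads = jax.grad(fit_loss)(ddt_params, xs)
        ddt_params = jax.tree_util.tree_map(lambda p, g: p - lr * g, ddt_params, grads)
    return ddt_params

def outer_loss(sampling_params, ddt_params, key):
    """Validation loss after training on data drawn from the learned sampler."""
    mean, log_std = sampling_params[0], sampling_params[1]
    xs = mean + jnp.exp(log_std) * jax.random.normal(key, (64,))  # reparameterized sampling
    trained = inner_update(ddt_params, xs)
    x_val = jnp.linspace(-3.0, 3.0, 64)                           # fixed representative grid
    return fit_loss(trained, x_val)

key = jax.random.PRNGKey(0)
ddt_params = init_ddt(key)
sampling_params = jnp.array([0.0, 0.0])                           # mean, log-std of the sampler
for step in range(50):
    key, sub = jax.random.split(key)
    g = jax.grad(outer_loss)(sampling_params, ddt_params, sub)
    sampling_params = sampling_params - 0.1 * g                   # adapt where data is sampled
```

In this sketch the outer gradient flows through both the inner training steps and the simulator itself, which is the property that a differentiable simulator provides over a black-box one.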