Multisource Heterogeneous Data Fusion Methods Driven by Digital Twin on Basis of Prophet Algorithm
With the development of intelligent manufacturing and the widening application of the Internet of Things (IoT), fusing heterogeneous sensor data from multiple sources has become crucial. However, current data fusion methods still suffer from low accuracy of the fused data, insufficient data integrity, poor fusion efficiency, and poor scalability. In response to these issues, this article explores a digital twin-driven multisource heterogeneous data fusion method based on the Prophet algorithm to improve the fusion of sensor data and provide stronger support for subsequent decision-making. The article first used curve and sequence alignment to extract data features and then analyzed data trends with the Prophet algorithm. Afterward, it constructed a digital twin model to provide analytical views and data services. Finally, it used tensor decomposition to merge the text and image data from the sensors. Deep learning algorithms and Kalman filtering techniques were also examined to confirm the efficacy of data fusion under the Prophet algorithm. The experimental results showed that after fusing the data with the Prophet algorithm, the average accuracy reached 92.63%, while the average resource utilization was only 9.97%. These results show that combining Prophet with digital twin technology achieves higher accuracy, higher fusion efficiency, and better scalability. This research can provide new ideas and means for the fusion and analysis of heterogeneous data from multiple sources.
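Prophet models a time series as an additive combination of trend, seasonality, and holiday terms. As a rough illustration of the trend-analysis step described above (not the paper's actual implementation, and using plain least squares in place of Prophet's piecewise-trend and Fourier-seasonality machinery), one can fit a linear trend plus one seasonal harmonic to a synthetic sensor series:

```python
import numpy as np

def fit_trend_seasonality(y, period=7):
    """Fit a simplified Prophet-style additive model y(t) ~ g(t) + s(t),
    with a linear trend g(t) and one periodic harmonic s(t), via ordinary
    least squares -- an illustrative stand-in for Prophet's full model."""
    t = np.arange(len(y), dtype=float)
    # Design matrix: intercept, slope, and one sine/cosine pair at the base period.
    X = np.column_stack([
        np.ones_like(t),
        t,
        np.sin(2 * np.pi * t / period),
        np.cos(2 * np.pi * t / period),
    ])
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return coef, X @ coef  # fitted coefficients and in-sample fit

# Synthetic sensor series: upward trend plus a weekly cycle (values invented).
t = np.arange(56)
y = 0.5 * t + 3.0 * np.sin(2 * np.pi * t / 7) + 10.0
coef, fitted = fit_trend_seasonality(y)
print(round(coef[1], 2))  # recovered slope, close to 0.5
```

On noise-free data the least-squares fit recovers the generating trend exactly; real sensor streams would add noise and changepoints, which Prophet handles with a piecewise trend.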
- Research Article
40
- 10.3390/buildings13112725
- Oct 29, 2023
- Buildings
Effective civil infrastructure management necessitates the utilization of timely data across the entire asset lifecycle for condition assessment and predictive maintenance. A notable gap in current predictive maintenance practices is the reliance on single-source rather than heterogeneous data, which decreases data accuracy, reliability, and adaptability, and ultimately the effectiveness of engineering decision-making. Data fusion is thus demanded to transform low-dimensional decisions from individual sensors into high-dimensional ones for decision optimization. In this context, digital twin (DT) technology is set to revolutionize the civil infrastructure industry by facilitating real-time data processing and informed decision-making. However, data-driven smart civil infrastructure management using DT is not yet achieved, especially in terms of data fusion. This paper aims to establish a conceptual framework for harnessing DT technology with data fusion to ensure the efficiency of civil infrastructures throughout their lifecycle. To achieve this objective, a systematic review of 105 papers was conducted to thematically analyze data fusion approaches and DT frameworks for civil infrastructure management, including their applications, core DT technologies, and challenges. Several gaps are identified, such as the difficulty of data integration due to data heterogeneity, seamless interoperability, difficulties associated with data quality, maintaining the semantic features of big data, technological limitations, and the complexities of algorithm selection. Given these challenges, this research proposed a framework emphasizing multilayer data fusion, the integration of open building information modeling (openBIM) and geographic information systems (GIS) for immersive visualization and stakeholder engagement, and the adoption of extended industry foundation classes (IFC) for data integration throughout the asset lifecycle.
- Research Article
22
- 10.1007/s11042-022-13231-1
- May 27, 2022
- Multimedia Tools and Applications
The prediction of stock market volatility is of great significance for rationally controlling financial market risk and increasing excess investment returns, and it has received extensive attention from academic and commercial circles. However, as a dynamic and complex system, the stock market is affected by multiple factors and encompasses complex financial data of many kinds. Given that the explanatory variables of the influencing factors are diverse, heterogeneous, and complex, existing intelligent algorithms have great limitations in analyzing and processing multi-source heterogeneous stock market data. Therefore, this study adopts an edge-weight and information-transmission mechanism suited to subgraph data for node screening, and gated recurrent unit (GRU) and long short-term memory (LSTM) networks to aggregate subgraph nodes. The compiled data contain the metapaths of three types of index data, and introducing an attention dimension over association relationships effectively mines the implicit meanings of multi-source heterogeneous data. The metapath attention mechanism is combined with a graph neural network to classify multi-source heterogeneous graph data, by which the prediction of stock market volatility is realized. The results show that the above method is feasible for fusing heterogeneous stock market data and mining the implicit semantic information of association relations. The accuracy of the proposed method for predicting stock market volatility is 16.64% higher than that of the dimensional-reduction index and 14.48% higher than that of other methods that fuse and predict heterogeneous data with the same model.
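The attention-based node aggregation described above can be illustrated, in heavily simplified form, by softmax-weighted pooling of node embeddings. The query vector `q` and the embeddings here are hypothetical stand-ins; the paper's GRU/LSTM aggregation, metapath construction, and graph structure are omitted:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_aggregate(H, q):
    """Aggregate node embeddings H (n_nodes x d) into one subgraph vector,
    weighting each node by the softmax similarity of its embedding to a
    query vector q -- a minimal stand-in for metapath attention."""
    scores = H @ q           # one attention score per node
    alpha = softmax(scores)  # normalized attention weights
    return alpha @ H, alpha  # weighted sum of node embeddings, plus weights

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 4))  # 5 nodes, 4-dim embeddings (illustrative)
q = rng.normal(size=4)
pooled, alpha = attention_aggregate(H, q)
print(pooled.shape, round(alpha.sum(), 6))  # (4,) 1.0
```

In a full model, `q` would be learned jointly with the classifier, and one such pooled vector would be produced per metapath before a second attention layer combines them.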
- Research Article
2
- 10.1002/sdtp.16992
- Apr 1, 2024
- SID Symposium Digest of Technical Papers
Urban centers serve as dynamic hubs of data and information, continually shaping the modern landscape. The fusion of Big Data and Digital Twin (DT) technology plays a pivotal role in advancing smart city initiatives. DT, acting as a comprehensive virtual replica mirroring physical entities' lifecycles, utilizes real-time data, simulations, and machine learning to enrich decision-making processes. In urban development, Big Data assumes diverse roles, particularly in urban planning, resource management, and traffic optimization, providing valuable, data-driven insights to decision-makers. Simultaneously, DT technology contributes significantly to modeling urban environments, enabling real-time simulations, and strengthening decision support systems. However, challenges persist, notably in data security and model precision. Addressing these challenges necessitates concerted efforts to enhance data privacy measures and refine the cognitive capabilities of DT models. This paper examines the intricate interplay between Big Data and DT technology in shaping the evolution of smart cities, offering insights into their roles, applications, and implementation challenges. Furthermore, it advocates for future research endeavors aimed at overcoming existing obstacles, thereby fostering secure and effective deployment of Big Data-driven DT technology and promoting innovative advancements in smart city management and sustainable development.
- Research Article
1
- 10.1007/s42452-025-06725-8
- Mar 26, 2025
- Discover Applied Sciences
As a critical component of the power system, a substation hosts a substantial number of intelligent devices. The data generated by these devices grow exponentially, and the data types are diverse and complex, encompassing both structured and unstructured data. The integration of multi-source heterogeneous data is of significant importance for enhancing the operational efficiency, fault early-warning capability, and intelligence level of substations. However, integrating multi-source heterogeneous data in substations has consistently been a challenge for AI technology, and effectively integrating data from different platforms remains difficult. Consequently, this paper proposes a multi-source heterogeneous data fusion method for substations based on cloud-edge collaboration and AI technology. The method builds on artificial intelligence cloud services and constructs a substation cloud-edge collaborative network based on an AI Cloud architecture. It utilizes the 5G PaaS platform to establish substation cloud PaaS and edge PaaS, respectively, and employs a dynamic task scheduling strategy for substation cloud-edge collaboration to coordinate multi-source heterogeneous data between the substation cloud and edge. A heterogeneous data resource pool is established, and the data fusion module uses a dynamic Bayesian network model to fuse the substation's multi-source heterogeneous data. The experimental results demonstrate that the proposed method controls the energy consumption of edge nodes more effectively, exhibits high data fusion efficiency, and can effectively integrate and display multi-source heterogeneous data. All information gain values exceed 0.96, with the integrity value being the highest, approaching 100%. The method demonstrates a strong capability in fusing multi-source heterogeneous substation data.
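The abstract does not specify the structure of the dynamic Bayesian network used. A one-dimensional Kalman filter, the simplest linear-Gaussian member of that model family, gives a flavor of how readings from heterogeneous sensors with different noise levels can be fused into one estimate; all numbers below are illustrative, not from the paper:

```python
def kalman_fuse(measurements, q=1e-3):
    """One-dimensional Kalman filter tracking a slowly varying quantity
    (e.g. a substation temperature) from a stream of (value, variance)
    readings produced by heterogeneous sensors. A linear-Gaussian Kalman
    filter is a special case of a dynamic Bayesian network; the DBN the
    paper actually uses is not described in the abstract.

    measurements: list of (z, r) pairs -- reading and its noise variance.
    q: process-noise variance of the assumed random-walk state model.
    """
    x, p = measurements[0]   # initialize state and variance from first reading
    for z, r in measurements[1:]:
        p += q               # predict: random-walk uncertainty grows
        k = p / (p + r)      # Kalman gain: trust low-variance sensors more
        x += k * (z - x)     # update the state estimate toward the reading
        p *= (1 - k)         # shrink the state variance after the update
    return x, p

# Two sensors with different noise levels observing the same value (~20.0).
readings = [(20.4, 0.5), (19.8, 0.5), (20.1, 0.1), (19.9, 0.1), (20.0, 0.1)]
x, p = kalman_fuse(readings)
print(round(x, 2), p < 0.1)  # fused estimate near 20.0, variance below either sensor's
```

The fused variance ends up smaller than any single sensor's noise variance, which is the basic payoff of probabilistic multi-sensor fusion.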
- Conference Article
- 10.31705/wcs.2023.70
- Jul 21, 2023
The digital twin (DT) presents an opportunity to integrate the physical world into the digital world. DT technology has the potential to transform the construction industry and respond to some of its challenges. In conventional construction projects, progress is largely monitored by direct observation and measurement, which suffers from numerous challenges, including low productivity, errors, and poor technological advancement. Concerns are now being raised about integrating technology for autonomously monitoring building activity. In other sectors, DT technology has reduced product development time and costs by up to 50%. However, the construction industry still lags in adopting DT. The overarching aim of this study was to explore the adaptability of DT to construction site progress monitoring. This study comprehensively reviews and analyses DT concepts, technologies, and applications in the construction industry; the parameters of DT applications in construction site progress monitoring; how DT could be used for site progress monitoring; common challenges in implementing DT for site progress monitoring; and strategies for overcoming barriers related to DT in site progress monitoring, using literature findings together with qualitative analysis of semi-structured interviews. This research shows that DT has a high potential to solve the numerous challenges in construction site progress monitoring, more so than other technologies currently in use. Thus, this study raises awareness of the need for the application of DT in construction site progress monitoring.
- Conference Article
5
- 10.1109/bibm.2018.8621165
- Dec 1, 2018
Discovering dynamic modules associated with disease progression is critical for revealing the mechanisms of cancers, which is the foundation for cancer diagnosis and therapy. Advances in biological technology make it possible to generate heterogeneous data for cancers. Thus, it is promising to discover cancer-related modules that capture the dynamics of pathways by fusing heterogeneous genomic data. Even though great efforts have been devoted to either dynamic module detection or heterogeneous data fusion, as far as we know, no method has been developed to extract dynamic modules via data fusion. In this study, we propose a general framework to detect dynamic modules based on nonnegative matrix factorization (called DrNMF), in which gene expression and the protein interaction network are integrated. Specifically, to obtain the dynamic networks associated with cancer progression, we construct a gene co-expression network for each clinical stage of the cancer. To extract the dynamic modules for each stage, we factorize the adjacency matrices at two subsequent stages, with the protein interaction network integrated via regularization. Experimental results demonstrate that the proposed algorithm is more accurate than state-of-the-art methods on simulated data. When DrNMF is applied to breast cancer data, the obtained dynamic modules are more enriched in known pathways. The proposed model and algorithm provide an effective way to integratively analyze heterogeneous genomic data and investigate the dynamics of cancers.
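The core factorization that such methods build on can be sketched with plain Lee-Seung multiplicative updates. DrNMF's coupling of consecutive stages and its protein-network regularizer are omitted here, so this is only the basic NMF step, not the paper's full algorithm:

```python
import numpy as np

def nmf(A, k, iters=500, seed=0):
    """Plain nonnegative matrix factorization A ~ W @ H via Lee-Seung
    multiplicative updates. Updates keep W and H nonnegative because each
    factor is multiplied by a ratio of nonnegative terms."""
    rng = np.random.default_rng(seed)
    n, m = A.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, m)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ A) / (W.T @ W @ H + 1e-12)  # multiplicative update for H
        W *= (A @ H.T) / (W @ H @ H.T + 1e-12)  # multiplicative update for W
    return W, H

# Exact low-rank nonnegative test matrix: the factorization should fit closely.
rng = np.random.default_rng(1)
A = rng.random((20, 3)) @ rng.random((3, 15))
W, H = nmf(A, k=3)
err = np.linalg.norm(A - W @ H) / np.linalg.norm(A)
print(err < 0.1)
```

On an adjacency matrix, the columns of `W` can be read as soft module memberships, which is the sense in which factorization yields module detection.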
- Research Article
2
- 10.1002/itl2.511
- Feb 16, 2024
- Internet Technology Letters
The Internet of Things (IoT) currently enables devices and systems in various fields to achieve interconnectivity, intelligence, and automation, which is significant for improving daily life. It connects objects through the Internet, achieving information exchange and sharing, bringing many conveniences to humanity, and improving the efficiency and quality of various industries. However, precisely because everything is interconnected, most IoT systems have high data throughput, which reduces their operational efficiency. Therefore, this article used digital twin (DT) technology to aggregate the multi-source heterogeneous data of the IoT, overcoming the problems of diversity and difference in massive data and thus accelerating the system's data processing. At the end of this article, an experiment was conducted on the IoT system of a certain university. Taking ten runs of the system as an example, the packet loss rate of the experimental group using DT technology was only 3.48%, while that of the control group running alone was 4.36%. This indicates that DT technology improved the performance of the IoT system. This study highlights the role of digital twin technology in addressing the low operational efficiency and the diversity and heterogeneity of data in IoT data aggregation, and its significant role in improving the operational efficiency and performance of IoT systems.
- Research Article
1
- 10.1080/0951192x.2025.2461034
- Feb 14, 2025
- International Journal of Computer Integrated Manufacturing
Digital twin technology has recently been introduced as one of the emerging technologies within the Industry 4.0 framework. It opens a promising path toward a more prognostic tool wear monitoring system, since the dynamics of tool conditions can be monitored and predicted by running the digital twin. Nevertheless, more research is needed on utilizing digital twin technology for monitoring tool wear. In the past, many systems used primarily data-driven methods to provide wear monitoring services within the digital twin application. The framework for representing the physical object consisted of 'black-box' data-driven models that described the relationship between sensor data and wear value. Implementation of the digital twin principle, which entails a dynamic representation and up-to-date information about physical entities, has been limited. This paper presents a micro-milling digital twin that simulates the micro-milling process dynamics based on physics-based models, such as spindle motor, spindle controller, and cutting torque models. The proposed digital twin is an alternative to prior tool wear monitoring systems, which were primarily based on indirect data-driven or direct visual approaches. It has shown potential benefits and a new direction for the tool wear monitoring field by adapting emerging technology within the Industry 4.0 ecosystem.
- Research Article
2
- 10.31629/jit.v2i2.3507
- Oct 31, 2021
- Journal of Innovation and Technology
Developments in virtual technology and data acquisition technology paved the way for digital twin (DT) technology. A digital twin is a virtual entity linked to a real-world entity; both the link and the virtual representation can be realized in several different ways. DT technology plays a key role in areas such as production management, manufacturing, health care, and smart cities, and was developed mainly to improve manufacturing processes. With the development of new-generation information and digitalization technologies, more data can be collected, and it is time to find ways to apply all these data deeply. As a result, the concept of the digital twin has attracted much attention and is developing rapidly. Digital twins make it possible to monitor, understand, and optimize the functions of physical entities, and they provide humans with continuous feedback to improve quality of life and well-being. A digital twin is best described as the effortless integration of data between a physical and a virtual machine in either direction. This paper provides an overview of digital twin technology used in different workspaces and of how it can be effective in Internet of Things networks.
- Book Chapter
- 10.4018/979-8-3693-4199-5.ch002
- Mar 14, 2025
Digital Twin (DT) technology has emerged as an innovative paradigm across industries, enabling the creation of virtual representations of physical facilities, processes, and systems. The concept, which originated in engineering and manufacturing, has extended to fields such as health care, transport, agriculture, and urban development. Through real-time data stream acquisition, modeling and simulation, operation optimization, decision-making improvement, and analytics, DTs allow businesses to gain better insight. Next-generation DT is an evolution of the current DT concept. This chapter briefly describes what DT technology and next-generation DT are, the core elements of a DT, and how a DT works. Furthermore, we consider the problems, prospects, and trends in the application of DTs. The chapter also focuses on DT-enabled machine learning architectures and on security concerns and their remedies. To show that DTs are useful for designing the future of interconnected, data-driven systems, examples from the literature and industry are presented.
- Research Article
2
- 10.61356/j.nswa.2024.19326
- Jul 1, 2024
- Neutrosophic Systems with Applications
Smart city sustainability initiatives prioritize creating environmentally, economically, and socially sustainable urban environments. Digital Twin (DT) technology creates precise digital replicas of physical assets, systems, or processes, and these digital twins play a crucial role in advancing the goals of smart city sustainability. This paper explores the development and application of DT technology for integrated regional energy systems in smart cities, emphasizing its potential to optimize energy consumption, reduce costs, and enhance overall system performance. The CloudIEPS platform, an energy internet planning platform based on digital twin technology, is a practical example of how digital twin technology can help optimize energy efficiency and reduce costs. Integrating digital twin technology with Multi-Criteria Decision-Making (MCDM) methods offers a novel approach to managing and optimizing energy systems in smart cities. The paper aims to create a consistent and robust approach to determining the best digital twin solution for energy systems in smart cities. It identifies critical factors for decision-making and establishes a method for assessing the significance of criteria using Triangular Neutrosophic Sets (TNS) through the MEthod based on Removal Effects of Criteria (MEREC) and the Multi-Attributive Ideal Real Comparative Analysis (MAIRCA) approach. These methods are used to evaluate and prioritize multiple criteria in decision-making processes, and combining them with TNS better accounts for the complex and uncertain nature of energy systems in smart cities. A case study is conducted to apply and validate the developed methodology and perform a sensitivity analysis of the experimental results.
The research outcomes indicated that the proposed methodology is robust and effective in handling the uncertainty and complexity inherent in smart cities' energy systems. The sensitivity analysis further confirms the stability and adaptability of the proposed methodology across different scenarios, making it a valuable tool for policymakers and stakeholders in the energy sector.
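A crisp (non-neutrosophic) MEREC weighting step can be sketched as follows. The normalization and removal-effect formulas follow the commonly published MEREC procedure, the decision matrix is invented for illustration, and the TNS and MAIRCA parts of the paper's pipeline are not shown:

```python
import numpy as np

def merec_weights(X, benefit):
    """Criterion weights via MEREC (MEthod based on Removal Effects of
    Criteria), crisp formulation: a criterion whose removal changes the
    alternatives' overall performance most gets the largest weight.

    X: (alternatives x criteria) decision matrix, strictly positive.
    benefit: boolean array, True where larger values are better."""
    X = np.asarray(X, dtype=float)
    m, n = X.shape
    # Normalize so that every entry lies in (0, 1]: invert benefit criteria.
    N = np.where(benefit, X.min(axis=0) / X, X / X.max(axis=0))
    L = np.abs(np.log(N))
    S = np.log1p(L.sum(axis=1) / n)          # overall performance per alternative
    E = np.empty(n)
    for j in range(n):                        # remove criterion j, measure the effect
        Sj = np.log1p((L.sum(axis=1) - L[:, j]) / n)
        E[j] = np.abs(Sj - S).sum()
    return E / E.sum()                        # removal effects normalized to weights

# Four hypothetical DT solutions scored on cost, reliability, and coverage.
X = [[450, 8, 54], [10, 9, 78], [100, 7, 85], [220, 6, 65]]
w = merec_weights(X, benefit=np.array([False, True, True]))
print(round(w.sum(), 6))  # weights sum to 1.0
```

In the paper's setting, these weights would then feed the MAIRCA ranking step, with crisp scores replaced by triangular neutrosophic values.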
- Research Article
- 10.1115/1.4070329
- Dec 1, 2025
- Journal of Computing and Information Science in Engineering
In recent years, the digital twin (DT) technology has become the subject of significant excitement as its potential to revolutionize many different industries has become clear. However, excitement around DT has led to an eagerness to label many models, simulations, and control systems (CSs) as DTs when they do not meet the strict definition of a DT. We posit that this mislabeling is partly due to the lack of a clear consensus on what is and is not a DT. In order to clarify the differences between DT and CS, this article reviews a number of current DT definitions for practical applicability, develops a refined definition, provides a method of differentiating between DTs and CSs, and presents use case studies to demonstrate the effectiveness of the method.
- Research Article
20
- 10.3390/smartcities7050101
- Sep 10, 2024
- Smart Cities
Digital Twin (DT) technology is a pivotal innovation within the built environment industry, facilitating digital transformation through advanced data integration and analytics. DTs have demonstrated significant benefits in building design, construction, and asset management, including optimising lifecycle energy use, enhancing operational efficiency, enabling predictive maintenance, and improving user adaptability. By integrating real-time data from IoT sensors with advanced analytics, DTs provide dynamic and actionable insights for better decision-making and resource management. Despite these promising benefits, several challenges impede the widespread adoption of DT technology, such as technological integration, data consistency, organisational adaptation, and cybersecurity concerns. Addressing these challenges requires interdisciplinary collaboration, standardisation of data formats, and the development of universal design and development platforms for DTs. This paper provides a comprehensive review of DT definitions, applications, capabilities, and challenges within the Architecture, Engineering, and Construction (AEC) industries. It offers important insights for researchers and professionals, helping them gain a more comprehensive and detailed view of DTs. The findings also demonstrate the significant impact that DTs can have on this sector, contributing to advancing DT implementations and promoting sustainable and efficient building management practices. Ultimately, DT technology is set to revolutionise the AEC industries by enabling autonomous, data-driven decision-making and optimising building operations for enhanced productivity and performance.
- Research Article
1
- 10.54097/jtpzzy16
- Nov 7, 2024
- Frontiers in Business, Economics and Management
The pharmaceutical supply chain is a multilayered and complex structure designed to deliver medicines to customers in a timely manner while ensuring optimal quality and quantity. The COVID-19 outbreak exposed the vulnerability and uncertainty of the pharmaceutical supply chain, so managing its risks has become particularly important. This study demonstrates that digital twin (DT) technology can improve pharmaceutical supply chain agility and reduce the ripple effect caused by disruptions. The ripple effect refers to the chain reaction that a sudden disruption at one node of the supply chain causes in the rest of the supply chain. The most widely used risk management tool today is the Enterprise Resource Planning (ERP) system, which allows real-time data sharing but has limited ability to predict, simulate, and model the entire supply chain and to support quick decisions. DT technology creates a virtual model of the supply chain that enables continuous communication and information exchange between real assets and the virtual model, offering a promising alternative for pharmaceutical supply chain risk management. This study uses the AnyLogistix platform to construct a DT model and demonstrates, through quantitative analysis, the effectiveness of DT technology in reducing ripple effects and improving supply chain agility. The paper focuses on the pharmaceutical supply chain for 75 mg aspirin tablets in London. A supply chain that operates normally, a supply chain subjected to a disruptive event (closure of the logistics center due to an earthquake), and a supply chain that applies a proactive risk-mitigation strategy (activation of an alternate logistics center) are simulated separately.
The positive impact of DT technology on the supply chain is evaluated by analyzing key performance indicators (KPIs) such as inventory level, order fulfillment rate, and delivery time. The experimental results show that DT technology enhances the responsiveness, flexibility, speed, and proactivity of the supply chain, which significantly improves its agility. Proactive strategies also reduce the financial impact of disruptions, and service levels recover quickly. In addition, the proactive strategy stabilized inventory levels and reduced delayed orders. These key metrics confirm the hypothesis that DT technology can improve supply chain agility and effectively reduce the ripple effect. Despite the significant conclusions drawn in this paper, there are some limitations. First, sensitivity analysis and t-tests were not used for secondary testing, which limits the reliability of the validation results. In addition, the possible interaction between the ripple effect and the bullwhip effect was not discussed, leaving a confounding factor in the experimental results. These issues can be addressed in future research. Finally, this paper explores the possibilities of introducing machine learning techniques and considers the supply chain risk factors of globalization. In summary, this study validates the potential of digital twin technology for improving agility and coping with ripple effects in pharmaceutical supply chains. As research continues, DT technology is expected to improve supply chain efficiency and reduce supply chain risk while ensuring patient safety.
- Research Article
- 10.1007/s40194-025-02245-6
- Nov 12, 2025
- Welding in the World
This study proposes an integrated approach for optimizing the friction stir welding (FSW) process by combining sensor-based data acquisition, machine learning (ML), and digital twin (DT) technologies. Real-time sensor data including rotational speed, welding speed, axial force, torque, and temperature were collected during FSW operations. These parameters were correlated with weld quality indicators, such as surface appearance, internal defects, and tensile strength. A dataset of 132 weld samples was used to train supervised and unsupervised ML models, achieving a defect classification accuracy of 95%. In parallel, a COMSOL-based digital twin was developed to simulate thermo-mechanical aspects of the welding process. The model incorporated temperature-dependent material properties, frictional heat generation, and plastic deformation behavior to predict stress, strain, and temperature distributions. Model predictions were validated against experimental sensor data, confirming accuracy in peak temperature and torque estimation. The integrated ML-DT system functioned as a decision-support tool, enabling real-time process monitoring, virtual experimentation, and predictive defect detection. When implemented in an industrial environment, the system dynamically adapted welding parameters to maintain optimal conditions. This approach enhances process stability, reduces material waste, and improves weld integrity, offering a scalable solution for intelligent manufacturing and Industry 4.0 applications.