Hybrid modeling architectures seek to combine a machine learning model with a computationally efficient (simplified or partial) physics model to predict the behavior of physical systems. Existing sequential or parallel approaches to hybrid modeling typically do not exploit the potential relationship between the input or latent features of the partial and full physics. In addition, very few existing architectures take advantage of the ability to sample the partial physics model liberally or on demand. To address these gaps, we have developed a novel neural network-based hybrid architecture called the "Opportunistic Physics-mining Transfer Mapping Architecture," or OPTMA. The goal of the OPTMA architecture is to better exploit input-space correlations between the partial and full physics where they exist. To this end, a transfer neural network transforms the original inputs into modified inputs or latent features, and the partial physics model then operates on these artificially transformed features to produce the final prediction. An extended back-propagation approach and Particle Swarm Optimization (to deal with multimodal loss functions) are used to train the network weights. The new architecture is first tested on a simple regression problem for analysis. It is then used to predict the behavior of more complex dynamic systems: an inverted pendulum and the motion of an unmanned aerial vehicle, both under wind effects. Subsequent tests on unseen samples demonstrate OPTMA's competitive performance compared to pure ANN and sequential hybrid models and provide empirical validation of the transfer concepts underlying OPTMA.
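The following is a minimal illustrative sketch (not the authors' code) of the OPTMA idea described above: a learned transfer network maps the original inputs to latent features, and a differentiable partial-physics model maps those features to the final prediction, so gradients flow through the physics during training. The partial-physics function, network sizes, and synthetic data below are assumptions for illustration only, and the Particle Swarm Optimization stage mentioned in the abstract is not shown.

```python
# Sketch of an OPTMA-style hybrid model (illustrative assumptions throughout).
import torch
import torch.nn as nn

def partial_physics(z):
    # Placeholder simplified physics model (assumption): an idealized damped
    # oscillator response computed from the transformed features z.
    omega, zeta, t = z[:, 0:1], z[:, 1:2], z[:, 2:3]
    return torch.exp(-zeta * t) * torch.sin(omega * t)

class OPTMA(nn.Module):
    def __init__(self, n_inputs=3, n_latent=3):
        super().__init__()
        # Transfer network: original inputs -> transformed (latent) inputs.
        self.transfer = nn.Sequential(
            nn.Linear(n_inputs, 32), nn.Tanh(),
            nn.Linear(32, n_latent),
        )

    def forward(self, x):
        z = self.transfer(x)        # learned input transformation
        return partial_physics(z)   # partial physics produces the prediction

# Training: back-propagating through the differentiable partial physics is the
# gradient-based analogue of the "extended back-propagation" named in the abstract.
model = OPTMA()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(256, 3)                         # synthetic inputs (assumption)
y = torch.sin(2.0 * x[:, 0:1]) * x[:, 2:3]     # synthetic "full physics" targets (assumption)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
```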