Abstract

Recurrent neural networks (RNNs) are a popular modeling choice for sequential data. However, empirical experience shows that RNNs are often difficult and time-consuming to tune and customize, which drives practitioners to replace them with non-recurrent neural networks that have achieved comparable performance in some cases. The success of non-recurrent neural networks on sequential data suggests that the dependencies in such data are short-term. In this paper, we systematically analyze the short-term dependency in ultra-high magnetic response (UHMR) systems with partial system knowledge, based on the observability criterion for dynamic systems. We show that the sequential data in the UHMR system have only a 2-step dependency. This result implies that any three consecutive steps in an experiment form one datum for training a feed-forward neural network (FFNN), so sufficient data can be collected from a small number of experiments. Based on this analysis, we train an FFNN to model the UHMR system using sequential data from four experiments. With proper data pre-processing, the FFNN model predicts the system performance with bounded mean absolute error.
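To illustrate the windowing implied by a 2-step dependency, the sketch below turns a sequence into (input, target) pairs where each pair consists of two consecutive steps as the FFNN input and the following step as the target. This is a minimal illustration, not the paper's implementation; the state dimension, sequence length, and function name are hypothetical, and only NumPy is assumed.

```python
import numpy as np

def make_two_step_dataset(sequence):
    """Turn one experiment's time series into (input, target) pairs.

    With a 2-step dependency, every window of three consecutive steps
    forms one training datum: steps t-2 and t-1 are the FFNN input,
    and step t is the prediction target.
    """
    seq = np.asarray(sequence)
    inputs, targets = [], []
    for t in range(2, len(seq)):
        inputs.append(seq[t - 2:t].ravel())  # steps t-2 and t-1, flattened
        targets.append(seq[t])               # step t
    return np.stack(inputs), np.stack(targets)

# Hypothetical example: four experiments of 100 steps with a 3-dimensional
# state yield 4 * 98 training pairs for the FFNN.
experiments = [np.random.randn(100, 3) for _ in range(4)]
X = np.concatenate([make_two_step_dataset(e)[0] for e in experiments])
y = np.concatenate([make_two_step_dataset(e)[1] for e in experiments])
print(X.shape, y.shape)  # (392, 6) (392, 3)
```

Because each experiment of length T contributes T - 2 training pairs, even a handful of experiments can supply enough data to fit the FFNN, which is the point the abstract makes about collecting sufficient data from only four experiments.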
