Abstract

Advances in machine learning have revolutionized capabilities in applications ranging from natural language processing to marketing to health care. Recently, machine learning techniques have also been employed to learn physics, but one formidable challenge is predicting complex dynamics, particularly chaos. Here, we demonstrate the efficacy of quasi-recurrent neural networks in predicting extremely chaotic behavior in multistable origami structures. While machine learning is often viewed as a “black box”, we conduct hidden layer analysis to understand how the neural network can process not only periodic but also chaotic data accurately. Our approach characterizes and predicts chaotic dynamics in a noisy vibration environment without relying on a mathematical model of the origami system. Therefore, our method is fully data-driven and has the potential to be used for complex scenarios, such as the nonlinear dynamics of thin-walled structures and biological membrane systems.

Highlights

  • We show that a unique triangulated cylindrical origami (TCO)-based platform can produce rich data sets from its complex dynamics, especially chaos, enabling us to examine the effectiveness of our data-driven approach

  • By utilizing experimentally measured data, we have trained quasi-recurrent neural networks (QRNNs) composed of three hidden layers and demonstrated their effectiveness in predicting both chaotic and periodic time series


Introduction

Advances in machine learning have revolutionized capabilities in applications ranging from natural language processing to marketing to health care. Recurrent neural networks (RNNs) constitute a powerful machine learning approach for processing and predicting time-series data[9,10,11] (see Supplementary Fig. 1a for a schematic illustration of a standard RNN). Owing to these capabilities, RNNs and their variants have been applied to dynamics problems[12,13]. Quasi-recurrent neural networks (QRNNs) have been developed for natural language analysis (see Supplementary Note 1 and Supplementary Fig. 1c)[20]. They process time-series data faster than other RNNs while achieving competitive performance. Most notably, the hidden states of a QRNN can be readily visualized and interpreted without additional processing (e.g., introducing self-attention to visualize how the input data are processed[21]). Since neither a mathematical model of the system nor knowledge of its dynamical nature (e.g., a definition of chaos) is required, our data-driven approach is model-free and can be used to analyze complex dynamics in the absence of prior knowledge of the underlying physics of a system.
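To make the QRNN idea concrete, the sketch below implements one QRNN layer (causal 1-D convolutions followed by element-wise "fo" pooling, after Bradbury et al.) and stacks three such layers with a linear read-out for next-step time-series prediction. The class names, layer width, convolution window, and two-channel input are illustrative assumptions, not the architecture or hyperparameters used in this work; the sketch only shows the general mechanism and how each layer's hidden states can be collected for direct inspection.

    # Minimal QRNN sketch in PyTorch. All sizes and names are illustrative
    # assumptions, not the configuration used in the paper.
    import torch
    import torch.nn as nn


    class QRNNLayer(nn.Module):
        """One QRNN layer: causal 1-D convolutions followed by fo-pooling."""

        def __init__(self, in_dim: int, hidden_dim: int, window: int = 2):
            super().__init__()
            self.window = window
            # Candidate (Z), forget (F), and output (O) gates share one convolution.
            self.conv = nn.Conv1d(in_dim, 3 * hidden_dim, kernel_size=window)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, time, features) -> (batch, features, time) for Conv1d.
            x = x.transpose(1, 2)
            # Left-pad so the convolution is causal (no access to future samples).
            x = nn.functional.pad(x, (self.window - 1, 0))
            z, f, o = self.conv(x).chunk(3, dim=1)
            z, f, o = torch.tanh(z), torch.sigmoid(f), torch.sigmoid(o)

            # fo-pooling: the only sequential step, and it is purely element-wise.
            c = torch.zeros_like(z[..., 0])
            hidden = []
            for t in range(z.size(-1)):
                c = f[..., t] * c + (1.0 - f[..., t]) * z[..., t]
                hidden.append(o[..., t] * c)
            return torch.stack(hidden, dim=1)  # (batch, time, hidden_dim)


    class QRNNForecaster(nn.Module):
        """Three stacked QRNN layers plus a linear read-out of the next sample."""

        def __init__(self, n_features: int = 2, hidden_dim: int = 64):
            super().__init__()
            self.layers = nn.ModuleList([
                QRNNLayer(n_features, hidden_dim),
                QRNNLayer(hidden_dim, hidden_dim),
                QRNNLayer(hidden_dim, hidden_dim),
            ])
            self.readout = nn.Linear(hidden_dim, n_features)

        def forward(self, x: torch.Tensor):
            states = []  # keep each layer's hidden states for later inspection
            for layer in self.layers:
                x = layer(x)
                states.append(x.detach())
            return self.readout(x), states


    if __name__ == "__main__":
        # 8 sequences, 200 time steps, 2 measured channels (placeholder data).
        signal = torch.randn(8, 200, 2)
        model = QRNNForecaster()
        prediction, hidden_states = model(signal)
        print(prediction.shape)           # torch.Size([8, 200, 2])
        # hidden_states[k][0] is a (time, hidden_dim) array for layer k, which
        # can be plotted directly as a heat map to inspect how the network
        # processes the input sequence.
        print(hidden_states[0][0].shape)  # torch.Size([200, 64])

Because the recurrence in fo-pooling involves no matrix multiplications, the per-time-step loop is cheap, which is why QRNNs process long sequences faster than standard RNNs while exposing hidden states that can be read off layer by layer.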

