Abstract

Recurrent neural networks could serve as surrogate material models, bridging the gap between component-level finite element simulations and numerically costly microscale models. Recent efforts have relied on gated recurrent neural networks. We show the limits of that approach: these networks are not self-consistent, i.e. their response depends on the increment size. We propose a recurrent neural network architecture that integrates self-consistency into its definition: the Linearized Minimal State Cell (LMSC). While LMSCs can be trained on short sequences, they perform best when applied to long sequences of small increments. We consider an elastoplastic example and train small models with fewer than 5000 parameters that precisely replicate the deviatoric elastoplastic behavior, with an optimal number of state variables. We integrate these models into an explicit finite element framework and demonstrate their performance on component-level simulations with tens of thousands of elements and millions of increments.
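
The paper's exact LMSC equations are not reproduced here, but the self-consistency idea can be sketched. Below is a minimal, hypothetical cell whose state update is linear in the strain increment, so that refining the discretization of a given strain path converges to a well-defined response instead of depending on the increment size. The class name, layer sizes, and the choice of six strain components are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class IncrementLinearCell(nn.Module):
    """Hypothetical rate-form recurrent cell: dh = A(h, eps) @ d_eps.

    Because the update is linear in the increment d_eps, splitting one
    increment into many smaller ones changes the state only at higher
    order, so the response converges as the discretization is refined.
    """
    def __init__(self, n_strain=6, n_state=25):
        super().__init__()
        self.n_strain, self.n_state = n_strain, n_state
        # Maps the current (state, strain) pair to a state-rate matrix A.
        self.rate = nn.Linear(n_state + n_strain, n_state * n_strain)

    def forward(self, h, eps, d_eps):
        A = self.rate(torch.cat([h, eps], dim=-1))
        A = A.view(*h.shape[:-1], self.n_state, self.n_strain)
        return h + (A @ d_eps.unsqueeze(-1)).squeeze(-1)
```

A simple self-consistency check drives such a cell along the same strain path twice, once with N increments and once with 10N; for a rate-form update the final states agree up to discretization error, whereas a generic gated cell such as a GRU offers no such guarantee.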

Highlights

  • We focus our attention on Gated Recurrent Unit (GRU) models of width 40 and the Linearized Minimal State Cell (LMSC) model with 40 units (a parameter-count sketch follows these highlights)

  • On the smooth test dataset, the best models appear to reach a plateau in performance (Fig. 7d) and the effect of adding state variables beyond 25 is not clear. It appears that when trained on large increments, LMSCs compensate for the restrictions imposed on their state-variable update rules by distributing individual state variables over several memory slots

  • We report the effect of the number of state variables on model performance in Fig. 8a–d, for both approaches and both datasets
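
These width-40 models are small by deep-learning standards. As a rough, hypothetical sanity check (not a computation from the paper), the snippet below counts the parameters of a single GRU cell of width 40; the input size of 6 (the independent strain components) is an assumption.

```python
import torch.nn as nn

# One GRU cell of width 40; input_size=6 is an assumed number of
# independent strain components.
cell = nn.GRUCell(input_size=6, hidden_size=40)
print(sum(p.numel() for p in cell.parameters()))
# 3 * (40*6 + 40*40 + 2*40) = 5760 for this configuration
```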


Introduction

Recent years have seen the application of novel machine learning methods to the modeling of materials in general (Bock et al., 2019), and of history-dependent mechanical behavior in particular. Several studies have integrated Fully Connected Neural Networks (FCNNs) into established phenomenological frameworks, replacing parameter-dependent functions (such as hardening curves) and providing enough flexibility to capture complex experimental data. Beyond augmenting the capabilities of existing models, one short-term research goal is to use neural networks to replace phenomenological models altogether. FCNNs can be used to that effect when the material's state space is fully understood, i.e. when the problem is fully specified. In general, however, the state variables necessary to capture history-dependent behavior are unknown, preventing the use of FCNNs.
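
To make the first of these approaches concrete, the sketch below embeds a small FCNN as a learnable hardening curve sigma_y(p) inside a classical one-dimensional elastic-predictor/plastic-corrector stress update. The network size, material constants, and the simplified corrector are assumptions for exposition, not taken from any of the cited studies.

```python
import torch
import torch.nn as nn

class HardeningNet(nn.Module):
    """FCNN standing in for a phenomenological hardening curve sigma_y(p)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
        self.sigma_y0 = 250.0  # initial yield stress [MPa], assumed

    def forward(self, p):
        # softplus keeps the learned hardening contribution non-negative
        return self.sigma_y0 + nn.functional.softplus(self.net(p))

def stress_update_1d(eps, eps_p, p, hardening, E=210e3):
    """One strain-driven step; E and the 1D setting are assumptions."""
    sig_trial = E * (eps - eps_p)                  # elastic predictor
    sig_y = hardening(torch.tensor([[p]])).item()  # NN-valued yield stress
    f = abs(sig_trial) - sig_y                     # yield function
    if f <= 0.0:
        return sig_trial, eps_p, p                 # purely elastic step
    # Plastic corrector; a complete implementation would also include the
    # hardening slope d(sigma_y)/dp in the denominator (omitted for brevity).
    dp = f / E
    eps_p = eps_p + dp * (1.0 if sig_trial > 0 else -1.0)
    return E * (eps - eps_p), eps_p, p + dp
```

Here the FCNN replaces only the scalar hardening function; the elasticity, flow rule, and the state variable p remain those of the phenomenological model, which is why this approach presupposes that the material's state space is known, as noted above.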
