For more than 100 years, chemical, physical, and material scientists have proposed competing constitutive models to best characterize the behavior of natural and man-made materials in response to mechanical loading. Now, computer science offers a universal solution: Neural Networks. Neural Networks are powerful function approximators that can learn constitutive relations from large amounts of data without any knowledge of the underlying physics. However, classical Neural Networks entirely ignore a century of research in constitutive modeling, violate thermodynamic considerations, and fail to predict behavior outside the training regime. Here we design a new family of Constitutive Artificial Neural Networks that inherently satisfy common kinematical, thermodynamical, and physical constraints and, at the same time, constrain the design space of admissible functions to create robust approximators, even in the presence of sparse data. Towards this goal, we revisit the non-linear field theories of mechanics and reverse-engineer the network input to account for material objectivity, material symmetry, and incompressibility; the network output to enforce thermodynamic consistency; the activation functions to implement physically reasonable restrictions; and the network architecture to ensure polyconvexity. We demonstrate that this new class of models is a generalization of the classical neo-Hooke, Blatz-Ko, Mooney-Rivlin, Yeoh, and Demiray models and that the network weights have a clear physical interpretation in the form of shear moduli, stiffness-like parameters, and exponential coefficients. When trained with classical benchmark data for rubber under uniaxial tension, biaxial extension, and pure shear, our network autonomously selects the best constitutive model and learns its parameters. Our findings suggest that Constitutive Artificial Neural Networks have the potential to induce a paradigm shift in constitutive modeling, from user-defined model selection to automated model discovery. Our source code, data, and examples are available at https://github.com/LivingMatterLab/CANN.
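The abstract outlines the key design idea: the network takes deformation invariants as input, returns a strain energy as output, and carries weights with a direct physical meaning, so that classical models emerge as special cases. The sketch below is a minimal Python illustration of that idea, not the authors' implementation from the linked repository; the function name `strain_energy`, the weights `w1`–`w4`, and the particular linear and exponential terms are assumptions chosen only to show how zeroing individual weights recovers, for example, a neo-Hooke-type or Mooney-Rivlin-type energy.

```python
# Minimal sketch (assumed, not the authors' code) of a CANN-style strain energy:
# inputs are the isotropic invariants I1, I2 of the deformation, the "activations"
# are linear and exponential functions of (I - 3), and the weights act as
# shear-modulus-like and exponential material parameters.

import numpy as np

def strain_energy(I1, I2, w):
    """Hypothetical two-invariant energy psi(I1, I2) with weights w = (w1, w2, w3, w4)."""
    w1, w2, w3, w4 = w
    psi = (w1 * (I1 - 3.0)                          # linear in I1: neo-Hooke-type term
           + w2 * (I2 - 3.0)                        # linear in I2: Mooney-Rivlin-type term
           + w3 * (np.exp(w4 * (I1 - 3.0)) - 1.0))  # exponential in I1: Demiray-type term
    return psi

# Example: incompressible uniaxial tension with stretch lam,
# giving I1 = lam^2 + 2/lam and I2 = 2*lam + 1/lam^2.
lam = 1.2
I1 = lam**2 + 2.0 / lam
I2 = 2.0 * lam + 1.0 / lam**2
print(strain_energy(I1, I2, w=[0.5, 0.1, 0.05, 1.5]))
```

Training such a model on stress-stretch data then amounts to fitting the weights; weights driven to zero deselect their terms, which is the sense in which model selection becomes automatic.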