Abstract

In the design of steel members based on beam theory, normal stresses in cross sections are generally determined using Bernoulli’s hypothesis for reasons of simplicity, neglecting the influence of shear strains (shear lag). However, in certain civil engineering applications, such as the wide-flanged cross sections commonly used in bridge engineering, shear lag can significantly influence the normal stress state. To capture these effects in practice, effective width methods have been developed and introduced into various standards. To calculate stress distributions affected by shear lag without resorting to the effective width method, finite element (FE) models with shell or solid elements may be used. However, setting up such FE models is time-consuming, and their application may require considerable computational effort. To avoid this effort in practical applications, this paper presents a novel stress calculation approach based on machine learning (ML). In recent years, ML methods have increasingly been applied in civil engineering because they automatically detect correlations in data and adapt dynamically to new application boundaries. The proposed approach aims to obtain accurate normal stress distributions in steel members, accounting for shear lag effects, using ML. To implement the approach, neural networks are employed as supervised ML models, trained on data generated by FE calculations with shell models. The performance of the approach is validated on cross sections with wide flanges, revealing high accuracy of the neural network. By subsequently interpreting the ML model, the influences of different parameters, such as cross section parameters, on the normal stress distributions are quantified, providing a deeper understanding of the mechanical problem solved by ML.

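As a rough illustration of the kind of supervised model described in the abstract, the following Python sketch trains a small feed-forward neural network to map cross-section parameters to normal stresses sampled at fixed stations across the flange width. This is not the authors' implementation: the feature set, array shapes, hyperparameters, and the random placeholder data standing in for FE shell results are all illustrative assumptions.

```python
# Minimal sketch of a supervised ML surrogate for shear-lag-affected stresses.
# Placeholder data is used where parametric FE shell results would be loaded.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Hypothetical training data exported from parametric FE shell analyses:
#   X: one row per sample, e.g. [flange width, flange thickness,
#      web height, web thickness, span length]
#   y: normal stresses sigma_x at n_points stations across the flange width
rng = np.random.default_rng(0)
n_samples, n_features, n_points = 1000, 5, 21
X = rng.uniform(0.1, 1.0, size=(n_samples, n_features))  # placeholder for FE input parameters
y = rng.normal(size=(n_samples, n_points))                # placeholder for FE stress distributions

# Standardize inputs before training the network.
scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# Multi-output regression: one network predicts the whole stress distribution.
model = MLPRegressor(hidden_layer_sizes=(64, 64),
                     activation="relu",
                     max_iter=2000,
                     random_state=0)
model.fit(X_scaled, y)

# Predict the shear-lag-affected stress distribution for a new cross section.
x_new = rng.uniform(0.1, 1.0, size=(1, n_features))
sigma_pred = model.predict(scaler.transform(x_new))       # shape: (1, n_points)
```

Predicting the full sampled stress distribution as a multi-output regression target, rather than a single effective width, is one plausible way to realize the approach described above; the paper's actual network architecture and input parametrization may differ.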