Abstract

Given the mathematical idealization of a structure, modeling introduces approximations which vanish as the number of unconstrained degrees of freedom of the finite element grid approaches infinity. As a practical matter, the computer code selected to support the analysis limits the definition of the idealized model and hence the accuracy of the analysis. Sensitivity analysis can furnish information on the effect of these limitations on structural response. However, by definition, the judgement exercised by the analyst in idealization is unassailable by the computer program. That is, we assume that during a finite element analysis the computer has no access to alternative unprogrammed idealizations.

A very important example of idealization is the choice of the constitutive equations and of the coefficients that particularize them. Choices affecting the relevance of the element model include the use of Euclidean geometry to define the length, volume, and deformations of the structure, and the coefficients chosen to particularize the geometry. The assumption of frictionless pins or spring supports, the limitation of the distribution of applied loadings, and the mathematical model of nodal displacement constraints are other examples of idealization choices.

Modeling approximations, by contrast, arise at the element and node level. Use of straight lines to represent curves, polygons or hyperbolas to model circular edges, and stepped surface thicknesses typify approximations to the original geometry. Use of polynomials with few terms to represent deformations is the most important modeling approximation of the deformed geometry. Modeling applied loadings and settlements by linear functions of element local coordinates illustrates modeling approximations of boundary conditions. By definition, when the modeling is optimum, the required computer solution accuracy is developed with a minimum of computer resources.
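The claim that modeling approximations vanish with refinement can be illustrated by the polygonal-edge example above. The following minimal sketch (an illustration, not drawn from the paper) measures the perimeter error of a regular n-gon inscribed in a unit circle; the error shrinks as the number of straight segments grows, just as finite element discretization error shrinks as degrees of freedom are added:

```python
import math

def inscribed_polygon_perimeter(n: int) -> float:
    """Perimeter of a regular n-gon inscribed in a circle of radius 1.

    Each of the n chords subtends an angle 2*pi/n, so its length is
    2*sin(pi/n); the perimeter is n times that.
    """
    return 2.0 * n * math.sin(math.pi / n)

exact = 2.0 * math.pi  # circumference of the unit circle
for n in (8, 32, 128):
    error = exact - inscribed_polygon_perimeter(n)
    print(f"n = {n:4d}  perimeter error = {error:.6f}")
```

Doubling the segment count roughly quarters the error, reflecting the second-order convergence of a piecewise-linear geometric model.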
The parameters of this optimization problem are the modeling choices, the analysis strategy, and the efficiency of the computer configuration in implementing the choices. In this study, we assume that the constraint on the optimum is evaluation of the external work in the structure to a prespecified number of significant digits. We measure the efficiency of the computer configuration by the relative number of degrees of freedom in the analysis. Melosh [1] emphasizes that conventional analysis with element models satisfying minimum convergence requirements can be much less efficient than analysis using curve fitting. Examples suggest that use of fitting can reduce the number of calculations by more than two orders of magnitude and the storage requirements by more than three orders. Furthermore, the previous study indicates that using rational polynomials or hyperbolic estimating functions for curve fitting offers no advantage over polynomials.
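To make the curve-fitting idea concrete, the sketch below extrapolates a converged external-work value from a short sequence of coarse meshes. This is a hedged illustration only, not Melosh's actual procedure: the model form W(n) ≈ W_inf + c·n^(−p), the assumed rate p, and the synthetic sample values are all assumptions introduced here for demonstration.

```python
def fit_extrapolate(ns, ws, p=2.0):
    """Estimate the converged work W_inf from coarse-mesh samples.

    Fits w = a + b * n**(-p) by linear least squares in x = n**(-p),
    with the convergence rate p assumed known; returns the intercept a,
    i.e. the estimate of W at n -> infinity.
    """
    xs = [n ** (-p) for n in ns]
    m = len(ns)
    sx, sy = sum(xs), sum(ws)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * w for x, w in zip(xs, ws))
    b = (m * sxy - sx * sy) / (m * sxx - sx * sx)
    a = (sy - b * sx) / m
    return a

# Synthetic data generated from an assumed model W(n) = 1.0 - 0.5 / n**2,
# standing in for external-work results on meshes of 4, 8, and 16 DOF.
ns = [4, 8, 16]
ws = [1.0 - 0.5 / n ** 2 for n in ns]
print(fit_extrapolate(ns, ws))  # recovers ~1.0 from coarse meshes only
```

Because only three small analyses feed the fit, the extrapolated value is obtained with far fewer degrees of freedom than a single fine-mesh solution of comparable accuracy, which is the efficiency argument the abstract summarizes.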
