Abstract
Hilber and Hughes set out several competing requirements for implicit time integration methods in 1978, yet no method designed within the traditional algorithmic framework satisfies all of them. By introducing auxiliary variables, this article proposes two novel implicit methods that meet these competing requirements without increasing computational cost. The two novel methods are self‐starting, unconditionally stable, single‐solve, identically second‐order accurate, controllably dissipative, and non‐overshooting. They achieve maximal dissipation of the auxiliary variables in the high‐frequency limit, although this design choice is not mandatory. The first method uses auxiliary velocity and acceleration variables, whereas the second employs two auxiliary acceleration variables. After the desired numerical characteristics are embedded, the methods take the four eigenvalues of the amplification matrices in the high‐frequency limit as user‐specified parameters to flexibly control numerical high‐frequency dissipation. To reduce the number of user‐specified parameters, the article also recommends several sub‐families of algorithms, such as schemes with optimal low‐frequency dissipation. The article further refines the error analysis to calculate amplitude and phase errors analytically for several implicit methods; this error analysis technique can be applied to almost all direct time integration methods. Comparisons of amplitude and phase errors highlight the superiority of the novel methods over previously published algorithms. In addition, classical measures such as numerical damping ratios and relative period errors are analyzed and compared. Numerical examples are solved to demonstrate the superiority of the novel methods.
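As a rough illustration of the classical measures mentioned above (not taken from the article), the Python sketch below computes the spectral radius, numerical damping ratio, and relative period error of the standard Newmark average-acceleration scheme from the eigenvalues of its amplification matrix for the undamped test equation u'' + ω²u = 0. The state ordering and function names are assumptions made for this example; the novel methods of the article are not reproduced here.

```python
import numpy as np

def newmark_amplification(omega_dt, beta=0.25, gamma=0.5):
    """Amplification matrix of the Newmark method (illustrative stand-in,
    not the article's methods) for u'' + omega^2 u = 0, with the
    nondimensional state x = (u, dt*v, dt^2*a)."""
    w2 = omega_dt ** 2
    # u_{n+1}(1 + beta*w2) = u_n + dt*v_n + (1/2 - beta)*dt^2*a_n
    u_next = np.array([1.0, 1.0, 0.5 - beta]) / (1.0 + beta * w2)
    a_next = -w2 * u_next                                  # dt^2 * a_{n+1} = -w2 * u_{n+1}
    v_next = np.array([0.0, 1.0, 1.0 - gamma]) + gamma * a_next
    return np.vstack([u_next, v_next, a_next])

def spectral_measures(A, omega_dt):
    """Spectral radius, numerical damping ratio, and relative period error
    from the principal complex-conjugate eigenvalue pair of A (assumes the
    principal roots are complex, as they are for average acceleration)."""
    eig = np.linalg.eigvals(A)
    rho = np.max(np.abs(eig))
    lam = eig[np.argmax(eig.imag)]               # principal root with Im > 0
    omega_bar = np.arctan2(lam.imag, lam.real)   # numerical frequency * dt
    xi_bar = -np.log(np.abs(lam)) / omega_bar    # numerical damping ratio
    period_err = omega_dt / omega_bar - 1.0      # (T_bar - T) / T
    return rho, xi_bar, period_err

for odt in (0.1, 0.5, 1.0):
    A = newmark_amplification(odt)
    print(odt, spectral_measures(A, odt))
```

For the average-acceleration parameters used here, the sketch reports a spectral radius of one, zero numerical damping, and a positive (period-elongation) error that grows with ωΔt, which is the baseline against which dissipative schemes such as those proposed in the article are typically compared.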