The staggered-grid finite difference method is one of the most widely used finite difference schemes owing to its high accuracy and stability. However, discretizing the temporal and spatial domains inevitably introduces numerical artifacts, most notably numerical dispersion. This paper introduces an approach that combines the widely used Taylor series expansion with the least squares method to suppress numerical dispersion: the staggered-grid finite difference coefficients are derived by integrating Taylor series expansions with a least squares fit. To validate the approach, we analyze its accuracy, dispersion, and stability, and present both simple and complex numerical examples. The results indicate that the method not only retains the dispersion suppression of the original Taylor series and least squares methods at small and medium wavenumbers but also surpasses both, and it remains robustly dispersion-suppressing at high wavenumbers.
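To make the idea concrete, below is a minimal Python sketch of one common way such hybrid coefficients can be obtained: a few low-order Taylor accuracy conditions are imposed exactly, and the remaining degrees of freedom minimize the least squares misfit of the staggered-grid dispersion relation over a wavenumber band. The function name, the band limit `beta_max`, and the number of Taylor constraints `n_taylor` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def staggered_ls_coeffs(M, beta_max=2.0, n_taylor=1, n_samples=1000):
    """Sketch: coefficients c_1..c_M of a 2M-point staggered-grid
    first-derivative operator, fit by least squares over the band
    beta = k*h in [0, beta_max], subject to n_taylor exact Taylor
    (low-order accuracy) constraints. Assumed setup, not the paper's."""
    beta = np.linspace(1e-6, beta_max, n_samples)
    m = np.arange(1, M + 1)
    # For a plane wave, the staggered operator maps k*h = beta to
    # 2 * sum_m c_m * sin((2m - 1) * beta / 2); fit this to beta itself.
    A = 2.0 * np.sin(np.outer(beta, 2 * m - 1) / 2.0)   # (n_samples, M)
    b = beta                                             # exact target
    # Taylor constraints: sum_m c_m (2m-1)^(2j-1) = 1 for j = 1, 0 for j > 1,
    # which enforces formal accuracy as beta -> 0.
    j = np.arange(1, n_taylor + 1)
    C = (2 * m - 1)[None, :] ** (2 * j - 1)[:, None]     # (n_taylor, M)
    d = np.zeros(n_taylor)
    d[0] = 1.0
    # Equality-constrained least squares via the KKT system:
    # [A^T A  C^T] [c     ]   [A^T b]
    # [C      0  ] [lambda] = [d    ]
    K = np.block([[A.T @ A, C.T],
                  [C, np.zeros((n_taylor, n_taylor))]])
    rhs = np.concatenate([A.T @ b, d])
    return np.linalg.solve(K, rhs)[:M]

# Example: 8-point operator (M = 4) with one exact Taylor condition.
print(staggered_ls_coeffs(4))
```

Setting `n_taylor = M` recovers the pure Taylor series coefficients, while `n_taylor = 1` leaves most of the freedom to the least squares fit; intermediate choices trade low-wavenumber accuracy against broadband dispersion suppression, which is the balance the abstract describes.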