Abstract

The method of moments (MoM) is a numerical technique for approximately solving linear operator equations, such as differential equations and integral equations. This chapter outlines the steps for approximating the unknown function in a linear operator equation by a known finite series. It then presents the Fourier series (FS) expansion of a continuous function over a specified finite region to reinforce the concept of minimizing the error, or residual. The basics of MoM are presented next, and a few simple examples are included to clarify the MoM procedure. Finally, complex radiating and scattering problems that cannot be handled analytically are simulated with MoM, and the results are compared with those of other numerical models.
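As a concrete illustration of the expansion-and-residual-minimization idea described above (a generic sketch, not an example drawn from the chapter itself), the following Python snippet applies a Galerkin-type MoM to the simple boundary-value problem -f''(x) = 1 + 4x^2 on [0, 1] with f(0) = f(1) = 0. The basis and testing functions u_n(x) = x - x^(n+1) are an assumed choice that satisfies the boundary conditions, and the matrix and excitation entries below are the analytically evaluated inner products specific to that choice.

```python
# Sketch: Galerkin MoM for -f''(x) = 1 + 4x^2 on [0, 1], f(0) = f(1) = 0.
# Unknown expanded as f(x) ~ sum_n alpha_n * u_n(x), u_n(x) = x - x^(n+1).
import numpy as np

N = 3  # number of basis/testing functions (assumed)
m = np.arange(1, N + 1)

# A[i, j] = <u_i, L u_j> = i*j / (i + j + 1), evaluated analytically
A = np.outer(m, m) / (m[:, None] + m[None, :] + 1)

# b[i] = <u_i, g> with g(x) = 1 + 4x^2, evaluated analytically
b = 1.5 - 1.0 / (m + 2) - 4.0 / (m + 4)

alpha = np.linalg.solve(A, b)  # expansion coefficients

# Compare the MoM approximation with the exact solution
x = np.linspace(0.0, 1.0, 101)
f_mom = sum(a * (x - x**(n + 1)) for n, a in zip(m, alpha))
f_exact = 5 * x / 6 - x**2 / 2 - x**4 / 3
print("max error:", np.max(np.abs(f_mom - f_exact)))  # ~0 for N >= 3
```

Because the exact solution here is a polynomial that lies in the span of the first three basis functions, the residual is driven to (numerically) zero at N = 3; for general problems the error decreases as N grows, which is the behavior the chapter's FS discussion is meant to reinforce.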
