Abstract
This chapter focuses on matrix calculus. It discusses the importance of polynomials and exponentials to calculus and differential equations. In using matrices to solve linear differential equations, one needs the basic concepts of polynomials and exponentials of matrices, together with techniques for calculating these matrix functions. Polynomials and exponentials of matrices play an equally important role in matrix calculus and matrix differential equations. Because a matrix commutes with itself, many of the properties of polynomials, such as addition, subtraction, multiplication, and factoring (but not division), remain valid for polynomials of a matrix. A sequence {B_k} of matrices, B_k = [b_ij^(k)], is said to converge to a matrix B = [b_ij] if the elements b_ij^(k) converge to b_ij for each i and j. In general, it is very difficult to compute functions of matrices from their definition as infinite series; one exception is the diagonal matrix. The Cayley–Hamilton theorem, however, provides a starting point for the development of an alternative, straightforward method for calculating these functions. A very important function in matrix calculus is e^{At}, where A is a square constant matrix, that is, a matrix all of whose entries are constants, and t is a variable.
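The following is a minimal numerical sketch of the ideas summarized above, not taken from the chapter itself; it assumes NumPy and SciPy, which the text does not reference. It contrasts the defining power series of e^{At} with a Cayley–Hamilton-style evaluation of e^{At} as a polynomial of degree n-1 in A (coefficients fixed by matching the scalar exponential at the eigenvalues, assumed distinct here), and it shows the diagonal-matrix shortcut.

```python
import numpy as np
from scipy.linalg import expm   # reference value computed by SciPy

def expm_series(A, t, terms=30):
    """Approximate e^{At} by truncating its defining series sum_k (At)^k / k!."""
    n = A.shape[0]
    total = np.eye(n)
    term = np.eye(n)
    for k in range(1, terms):
        term = term @ (A * t) / k          # builds (At)^k / k! incrementally
        total = total + term
    return total

def expm_cayley_hamilton(A, t):
    """Evaluate e^{At} as alpha_0 I + alpha_1 A + ... + alpha_{n-1} A^{n-1}.

    The coefficients alpha_k are found by requiring e^{lam t} = sum_k alpha_k lam^k
    at each eigenvalue lam of A (assumed distinct here, for simplicity)."""
    n = A.shape[0]
    lam = np.linalg.eigvals(A)
    V = np.vander(lam, n, increasing=True)       # Vandermonde system in the eigenvalues
    alpha = np.linalg.solve(V, np.exp(lam * t))
    result = np.zeros((n, n), dtype=complex)
    power = np.eye(n)
    for k in range(n):
        result = result + alpha[k] * power
        power = power @ A
    return result.real                           # imaginary parts cancel for a real A

# Example: a constant 2x2 matrix with distinct eigenvalues -1 and -2.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
t = 0.5
reference = expm(A * t)

print(np.allclose(expm_series(A, t), reference))            # True
print(np.allclose(expm_cayley_hamilton(A, t), reference))   # True

# Diagonal case: e^{Dt} is just the entrywise exponential of the diagonal.
D = np.diag([1.0, -4.0])
print(np.allclose(expm(D * t), np.diag(np.exp(np.diag(D) * t))))  # True
```

The truncated series converges here because the example matrix is small and well behaved, but the Cayley–Hamilton route is the one the chapter develops: it reduces any function of an n-by-n matrix to a polynomial of degree at most n-1 in that matrix.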