Abstract

In this article, we determine the eigenvalues and eigenvectors of a square matrix by a new approach. The approach assumes that all the roots, together with their multiplicities, are known, and uses only simple multiplication of a matrix by a vector. The process does not even require matrix inversion.

Highlights

  • There are many algorithms to determine the eigenvalues and eigenvectors of a square matrix [1]-[4]

  • The proof is easy once it is noted that $A = U \Lambda U^{-1}$ and $x_{r+1} = A^{r} x_{1} = \sum_i \lambda_i^{r} u_i$ when $x_1 = \sum_i u_i$. Since eigenvectors are in general unique up to scale, and eigenvectors associated with different eigenvalues are linearly independent, the choice of $x_1$ above is not restrictive; unless one is extremely unlucky, any vector, such as an arbitrarily chosen one, will be a linear combination of all the $u_i$'s

  • THEOREM 2: The vector $V\big|_{\lambda=\lambda^*} = X_n q\big|_{\lambda=\lambda^*}$ is an eigenvector of $A$ associated with the eigenvalue $\lambda^*$
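The identity in the first highlight can be checked numerically. The sketch below is illustrative only: the matrix `A` is an arbitrary example chosen here, not one from the paper. It verifies that with $x_1 = \sum_i u_i$, repeated matrix-vector multiplication gives $A^r x_1 = \sum_i \lambda_i^r u_i$, with no matrix inversion in the iteration itself.

```python
import numpy as np

# A small diagonalizable matrix with distinct eigenvalues (illustrative choice,
# not from the paper).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigendecomposition A = U diag(lam) U^{-1}; columns of U are eigenvectors u_i.
lam, U = np.linalg.eig(A)

# Take x1 as the sum of the eigenvectors: x1 = sum_i u_i.
x1 = U.sum(axis=1)

# Then x_{r+1} = A^r x1, computed using only matrix-vector products.
r = 5
x = x1.copy()
for _ in range(r):
    x = A @ x                      # one matrix-vector multiplication per step

# The claimed closed form: sum_i lam_i^r u_i.
expected = U @ (lam ** r)
assert np.allclose(x, expected)
```

For large `r` the term with the largest $|\lambda_i|$ dominates, which is why this identity underlies the classical power iteration.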



Introduction

There are many algorithms to determine the eigenvalues and eigenvectors of a square matrix [1]-[4]. When a matrix has multiple roots, the usual iterative approaches generally fail to work unless additional properties of the eigenvectors of repeated roots are exploited. It is theoretically possible (and in practice achievable with some success) to obtain all the eigenvectors (including the generalized eigenvectors connected with the Jordan reduction) of a matrix, provided all the roots with their multiplicities are known, using only simple multiplication of a matrix by a vector. Rather, we shall only state a relevant new theorem in matrix theory; the implications of this theorem and its extensions in more general contexts are dealt with in a separate study.
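One way to realize the claim above, that eigenvectors can be recovered from known roots using matrix-vector products alone, is the annihilating-polynomial idea: applying $\prod_{j \ne k}(A - \lambda_j I)$ to a generic vector kills every eigencomponent except the one belonging to $\lambda_k$. The sketch below is an assumption-laden illustration of that idea (the function name, the example matrix, and the restriction to distinct eigenvalues are all ours, not the paper's).

```python
import numpy as np

def eigvec_from_known_roots(A, eigenvalues, k, seed=0):
    """Sketch: eigenvector of A for eigenvalues[k], assuming all roots are
    known and distinct. Uses only matrix-vector products, no inversion."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])   # generic starting vector
    for j, lam in enumerate(eigenvalues):
        if j != k:
            v = A @ v - lam * v           # multiply by (A - lam_j I)
    return v / np.linalg.norm(v)

# Illustrative matrix with eigenvalues 5 and 2 (trace 7, determinant 10).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
v = eigvec_from_known_roots(A, [5.0, 2.0], k=0)

# v satisfies the eigenpair relation A v = 5 v.
assert np.allclose(A @ v, 5.0 * v)
```

Repeated roots require more care (the product above only reaches the full eigenspace or generalized eigenvectors with extra structure), which is presumably where the paper's theorem and its Jordan-reduction discussion come in.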

Basic Points
Main Results
Illustrations
Illustration 2
Illustration 3
Illustration 4
Illustration 5
Case 1
Case 4
Case 5
Summary
