Abstract

An $n\times n$ matrix is said to have a self-interlacing spectrum if its eigenvalues $\lambda_k$, $k=1,\ldots,n$, are distributed as follows: $$ \lambda_1>-\lambda_2>\lambda_3>\cdots>(-1)^{n-1}\lambda_n>0. $$ A method for constructing sign-definite matrices with self-interlacing spectrum from totally nonnegative ones is presented. This method is applied to bidiagonal and tridiagonal matrices. In particular, a result by O. Holtz on the spectrum of real symmetric anti-bidiagonal matrices whose nonzero entries are positive is generalized.
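As a minimal illustration of the ordering defined above (not of the paper's construction itself), the following sketch checks numerically whether a given list of eigenvalues is self-interlacing; the function name and tolerance are illustrative choices, not part of the original text.

```python
import numpy as np

def is_self_interlacing(eigs, tol=1e-12):
    """Check lambda_1 > -lambda_2 > lambda_3 > ... > (-1)^(n-1) lambda_n > 0.

    Equivalently, the signed sequence (-1)^(k-1) * lambda_k must be
    strictly decreasing and its last term must be positive.
    """
    signed = np.array([(-1) ** k * lam for k, lam in enumerate(eigs)])
    return bool(np.all(np.diff(signed) < -tol) and signed[-1] > tol)

# An alternating spectrum 5 > -(-4) > 3 > -(-2) > 1 > 0 is self-interlacing;
# changing the last eigenvalue's sign breaks the pattern.
print(is_self_interlacing([5.0, -4.0, 3.0, -2.0, 1.0]))   # True
print(is_self_interlacing([5.0, -4.0, 3.0, -2.0, -1.0]))  # False
```

The check works on the eigenvalues in the order $\lambda_1,\ldots,\lambda_n$; for a concrete matrix one would first compute and sort its spectrum by absolute value before applying it.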
