Abstract

A real matrix $A = \|a_{ik}\|$ $(i = 1, \ldots, m;\ k = 1, \ldots, n)$ is said to be totally positive if all its minors, of any order, are non-negative. In 1930 the author showed that if $A$ is totally positive, then the linear transformation $$ y_i = \sum\limits_{k = 1}^n a_{ik} x_k \quad \left( i = 1, \ldots, m \right) \tag{1} $$ is variation-diminishing in the sense that if $v(x_k)$ denotes the number of variations of sign in the sequence $x_k$ and $v(y_i)$ the corresponding number in the sequence $y_i$, then we always have the inequality $v(y_i) \le v(x_k)$. In the same paper of 1930 the author showed that (1) is certainly variation-diminishing if the matrix $A$ does not possess two minors of equal orders and of opposite signs; the converse also holds to a certain extent: if (1) is variation-diminishing and the rank of $A$ equals $n$, then $A$ cannot have two minors of equal orders and of opposite signs. The necessary and sufficient conditions for (1) to be variation-diminishing were found in 1933 by Th. Motzkin. Since they will be used in this paper, we state them here as follows: let $r$ be the rank of $A$; then $A$ should not have two minors of equal orders and of opposite signs if their common order is less than $r$, while if their common order equals $r$, they should again never be of opposite signs if they belong to the same combination of $r$ columns of $A$.
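As a small numerical illustration of the definitions above (the matrix, helper names, and tolerance below are my own choices, not from the paper): the sketch counts sign variations in a sequence, verifies total positivity by brute force over all minors, and checks the variation-diminishing inequality $v(y_i) \le v(x_k)$ on a Vandermonde matrix with increasing positive nodes, a standard totally positive example.

```python
import numpy as np
from itertools import combinations

def sign_variations(seq):
    """Count sign changes in a sequence, ignoring zero terms."""
    signs = [np.sign(t) for t in seq if t != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

def is_totally_positive(A, tol=1e-12):
    """Check that every minor of every order is non-negative (brute force)."""
    m, n = A.shape
    for order in range(1, min(m, n) + 1):
        for rows in combinations(range(m), order):
            for cols in combinations(range(n), order):
                if np.linalg.det(A[np.ix_(rows, cols)]) < -tol:
                    return False
    return True

# Vandermonde matrix with nodes 0 < t1 < t2 < t3: rows (1, t, t^2).
nodes = np.array([1.0, 2.0, 3.0])
A = np.vander(nodes, 3, increasing=True)

x = np.array([1.0, -2.0, 1.0])   # v(x) = 2 sign variations
y = A @ x                        # y = (0, 1, 4): no sign variations

assert is_totally_positive(A)
assert sign_variations(y) <= sign_variations(x)
```

The brute-force minor check is exponential in the matrix dimensions, so it is only practical for small examples like this one; it serves here purely to make the definition concrete.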
