Abstract

In this article, we consider optimization problems involving the sums of the largest eigenvalues of symmetric matrices. Viewed as functions of a symmetric matrix, these eigenvalues fail to be smooth wherever their multiplicity exceeds one, which makes such problems difficult to solve. To address this, the sum of the largest eigenvalues composed with an affine matrix-valued mapping is handled via the U-Lagrangian theory. This theory extends the corresponding results for the largest eigenvalue function in the literature. Inspired by the UV-space decomposition, the first- and second-order derivatives of the U-Lagrangian in the space of decision variables R^m are derived when a regularity condition is satisfied. Under this condition, the vectors of the V-space generate an implicit function, from which a smooth trajectory tangent to the U-space can be defined. Moreover, an algorithmic framework with superlinear convergence is presented. Finally, we provide an application to an arbitrary eigenvalue function, which is typically a DC (difference-of-convex) function, to verify the validity of our approach.
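To make the central object concrete, the following sketch evaluates f(x) = sum of the k largest eigenvalues of an affine matrix-valued mapping A(x) = A0 + x_1*A_1 + ... + x_m*A_m. The specific matrices A0 and A_1 below are hypothetical, chosen only so that the second-largest eigenvalue has multiplicity two at x = 0, i.e. a point where f is nonsmooth:

```python
import numpy as np

def sum_largest_eigenvalues(x, A0, As, k):
    """Evaluate f(x) = sum of the k largest eigenvalues of the affine
    mapping A(x) = A0 + x_1*A_1 + ... + x_m*A_m (A0, A_i symmetric)."""
    A = A0 + sum(xi * Ai for xi, Ai in zip(x, As))
    w = np.linalg.eigvalsh(A)      # eigenvalues of a symmetric matrix, ascending
    return float(np.sum(w[-k:]))   # sum of the k largest

# Hypothetical data: at x = 0 the eigenvalues of A0 are (2, 1, 1), so the
# eigenvalue 1 has multiplicity two and f(x) = lambda_1 + lambda_2 is
# nonsmooth there; moving x splits the multiple eigenvalue.
A0 = np.diag([2.0, 1.0, 1.0])
As = [np.diag([0.0, 1.0, -1.0])]
print(sum_largest_eigenvalues([0.0], A0, As, k=2))  # -> 3.0
```

The `eigvalsh` routine exploits symmetry and returns eigenvalues in ascending order, so the slice `w[-k:]` picks out the k largest.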
