Abstract

Linear eigenvalue analysis provides a fundamental framework for many scientific and engineering disciplines. Consequently, vast research has been devoted to numerical schemes for computing eigenfunctions. In recent years, new research in image processing and machine learning has shown the applicability of nonlinear eigenvalue analysis, specifically based on operators induced by convex functionals. This has provided new insights, a better theoretical understanding, and improved algorithms for image processing, clustering, and classification. However, the theory of nonlinear eigenvalue problems is still at a preliminary stage. We present a new class of nonlinear flows that can generate nonlinear eigenfunctions of the form $T(u)=\lambda u$, where $T(u)$ is a nonlinear operator and $\lambda \in \mathbb{R}$ is the eigenvalue. We develop the theory for the case where $T(u)$ is a subgradient element of a regularizing one-homogeneous functional, such as total variation (TV) or total generalized variation (TGV). We focus on a forward flow that simultaneously smooths the solution (with respect to the regularizer) while increasing its 2-norm. An analogous discrete flow and its normalized version are formulated and analyzed. The flows translate to a series of convex minimization steps. In addition, we suggest an indicator that measures the affinity of a function to an eigenfunction, and relate it to pseudo-eigenfunctions in the linear case.
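To make the abstract's description concrete, the sketch below shows what such a discrete flow could look like in the simplest setting: a 1D signal with $J = $ TV as the one-homogeneous regularizer. It is an illustrative simplification under stated assumptions, not the paper's exact scheme: each step pushes $u$ outward along itself (increasing the 2-norm) and then applies the TV proximal operator (the implicit smoothing step, one convex minimization per iteration, solved here with a Chambolle-style dual scheme). All function names (`prox_tv_1d`, `flow_step`, `eigen_affinity`) and the particular splitting are hypothetical choices for this sketch.

```python
import numpy as np

def div_adj(p):
    """Adjoint D^T of the forward-difference operator D(u)[i] = u[i+1] - u[i]."""
    return np.concatenate(([-p[0]], p[:-1] - p[1:], [p[-1]]))

def prox_tv_1d(v, w, n_iter=300):
    """prox_{w*TV}(v) = argmin_u 0.5*||u - v||^2 + w * sum_i |u[i+1] - u[i]|,
    via projected gradient ascent on the dual (a Chambolle-style scheme)."""
    p = np.zeros(len(v) - 1)              # dual variable, one entry per edge
    tau = 0.25                            # step size <= 1/||D||^2 in 1D
    for _ in range(n_iter):
        u = v - div_adj(p)                # current primal estimate
        p = np.clip(p + tau * np.diff(u), -w, w)
    return v - div_adj(p)

def flow_step(u, dt):
    """One discrete step of a norm-increasing forward TV flow (an assumed
    simplification, not necessarily the authors' scheme): push u outward
    along itself, then smooth implicitly with one convex prox problem."""
    v = u + dt * u / np.linalg.norm(u)    # forward part: grows the 2-norm
    u_next = prox_tv_1d(v, dt)            # implicit part: convex minimization
    p = (v - u_next) / dt                 # p in dTV(u_next), by prox optimality
    return u_next, p

def eigen_affinity(u, p):
    """Normalized correlation of u with its subgradient p; equals 1 exactly
    when p = lambda*u for lambda > 0, i.e. when u is an eigenfunction."""
    return np.dot(u, p) / (np.linalg.norm(u) * np.linalg.norm(p))

# Iterate from noise; the affinity should climb toward 1 as the flow
# approaches a TV eigenfunction.
u = np.random.randn(128)
for k in range(100):
    u, p = flow_step(u, dt=0.1)
print("affinity:", eigen_affinity(u, p))
```

A fixed point of `flow_step` satisfies $p = u/\Vert u\Vert_2$ with $p \in \partial \mathrm{TV}(u)$, i.e. an eigenfunction with $\lambda = 1/\Vert u\Vert_2$, which is how the smoothing and norm-growth terms balance. The `eigen_affinity` value is one natural instance of the affinity indicator mentioned in the abstract: it is the cosine of the angle between $u$ and its subgradient, analogous to a normalized Rayleigh-quotient residual for pseudo-eigenfunctions in the linear case.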
