Abstract

Many point estimation problems in robotics, computer vision, and machine learning can be formulated as instances of the general problem of minimizing a sparse nonlinear sum-of-squares objective function. For inference problems of this type, each input datum gives rise to a summand in the objective function, and therefore performing online inference corresponds to solving a sequence of sparse nonlinear least-squares minimization problems in which additional summands are added to the objective function over time. In this paper, we present Robust Incremental least-Squares Estimation (RISE), an incrementalized version of Powell's Dog-Leg numerical optimization method suitable for use in online sequential sparse least-squares minimization. As a trust-region method, RISE is naturally robust to objective function nonlinearity and numerical ill-conditioning and is provably globally convergent for a broad class of inferential cost functions (twice-continuously differentiable functions with bounded sublevel sets). Consequently, RISE maintains the speed of current state-of-the-art online sparse least-squares methods while providing superior reliability.
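The abstract names the core machinery: each new measurement adds a summand to a sum-of-squares objective f(x) = sum_i ||r_i(x)||^2, and Powell's Dog-Leg chooses each update by blending a Gauss-Newton step with a gradient-descent step inside a trust region. As a point of reference, the following is a minimal Python/NumPy sketch of one classical (batch) dog-leg step. It illustrates the underlying trust-region technique only, not the paper's incremental RISE algorithm; the function name dogleg_step and its interface are assumptions made for this example.

```python
import numpy as np

def dogleg_step(J, r, Delta):
    """One classical Powell Dog-Leg trust-region step for min ||r(x)||^2.

    Illustrative sketch only (not the paper's incremental RISE method).
    J     : Jacobian of the residual vector r at the current iterate.
    r     : residual vector at the current iterate.
    Delta : trust-region radius.
    """
    g = J.T @ r                        # gradient of (1/2)||r||^2
    # Cauchy (steepest-descent) step: exact minimizer of the quadratic
    # model along -g (assumes g != 0).
    alpha = (g @ g) / np.linalg.norm(J @ g) ** 2
    h_sd = -alpha * g
    # Gauss-Newton step, computed via least squares to avoid forming J^T J.
    h_gn = np.linalg.lstsq(J, -r, rcond=None)[0]

    if np.linalg.norm(h_gn) <= Delta:
        return h_gn                    # full Gauss-Newton step fits in the region
    if np.linalg.norm(h_sd) >= Delta:
        # Even the Cauchy step leaves the region: truncate it to the boundary.
        return (Delta / np.linalg.norm(h_sd)) * h_sd
    # Otherwise interpolate along the dog-leg path: find beta in (0, 1)
    # such that ||h_sd + beta * (h_gn - h_sd)|| = Delta.
    d = h_gn - h_sd
    a, b, c = d @ d, 2 * (h_sd @ d), h_sd @ h_sd - Delta ** 2
    beta = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return h_sd + beta * d
```

In a full trust-region loop, Delta would then be grown or shrunk according to how well the model's predicted reduction matches the actual reduction in the objective; per the abstract, RISE's contribution is carrying out such steps efficiently online, as new summands are appended to the objective over time.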
