Abstract

We present an efficient optimization framework that solves trajectory optimization problems by decoupling state variables from timing variables, thereby decomposing a challenging nonlinear programming (NLP) problem into two easier subproblems. With timing fixed, the state variables can be optimized efficiently via convex optimization, while the timing variables are optimized in a separate NLP; together these form a bilevel optimization problem. The challenge is obtaining the gradient of the upper-level objective, whose evaluation itself requires solving an optimization problem. Whereas finite differences must solve many such problems to estimate the gradient, our method builds on sensitivity analysis of parametric programming: the dual solution (Lagrange multipliers) of the lower-level optimization yields analytical gradients. Since the dual solution is a by-product of the optimization, the exact gradients are obtained "for free". The framework is demonstrated on generating trajectories in safe corridors for an unmanned aerial vehicle. Experiments show that bilevel optimization converges significantly more reliably than a standard NLP solver, and that analytical gradients outperform finite differences in both computation speed and accuracy. With a 25 ms cutoff time, our approach achieves over 8 times better suboptimality than the current state-of-the-art.
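The "free gradient" idea can be illustrated on a toy parametric quadratic program (this is a hand-rolled sketch with illustrative names, not the paper's corridor problem): by the envelope theorem, the optimal dual multiplier of a parameter-dependent constraint equals the derivative of the optimal value with respect to that parameter, so one lower-level solve returns both the objective value and its gradient, whereas finite differences need extra solves.

```python
# Toy illustration of the sensitivity-analysis idea: for the parametric QP
#   v(theta) = min_x 0.5*(x - c)^2   subject to   x >= theta,
# the optimal dual multiplier lam of the constraint equals dv/dtheta
# (envelope theorem), so the gradient is a by-product of the solve.

def solve_lower_level(theta, c=1.0):
    """Solve the 1-D QP in closed form; return (optimal value, dual)."""
    if theta <= c:            # constraint inactive at the optimum
        x = c
        lam = 0.0
    else:                     # constraint active: x = theta
        x = theta
        lam = theta - c       # KKT stationarity: (x - c) - lam = 0
    value = 0.5 * (x - c) ** 2
    return value, lam

def finite_difference_grad(theta, h=1e-6):
    """Gradient of v(theta) via central differences (two extra solves)."""
    vp, _ = solve_lower_level(theta + h)
    vm, _ = solve_lower_level(theta - h)
    return (vp - vm) / (2 * h)

theta = 3.0
v, lam = solve_lower_level(theta)    # one solve gives value AND exact gradient
fd = finite_difference_grad(theta)   # two additional solves, approximate
print(v, lam, fd)                    # the dual matches the finite difference
```

In the paper's setting the lower-level problem is a convex trajectory program rather than this scalar QP, but the mechanism is the same: a solver that reports dual variables hands the upper-level NLP its gradient at no extra cost.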
