Abstract

In this article, we present an efficient algorithm for solving a class of chance-constrained optimization under nonparametric uncertainty. Our algorithm is built on the possibility of representing arbitrary distributions as functions in Reproducing Kernel Hilbert Space (RKHS). We use this foundation to formulate chance-constrained optimization as one of minimizing the distance between a desired distribution and the distribution of the constraint functions in the RKHS. We provide a systematic way of constructing the desired distribution based on the notion of scenario approximation. Furthermore, we use the kernel trick to show that the computational complexity of our reformulated optimization problem is comparable to solving a deterministic variant of the chance-constrained optimization. We validate our formulation on two important robotic applications: 1) reactive collision avoidance of mobile robots in uncertain dynamic environments and 2) inverse-dynamics-based path-tracking of manipulators under perception uncertainty. In both these applications, the underlying chance constraints are defined over nonlinear and nonconvex functions of uncertain parameters and possibly also decision variables. We also benchmark our formulation with the existing approaches in terms of sample complexity and the achieved optimal cost highlighting significant improvements in both these metrics.

Highlights

  • Consider the following optimization problem in terms of a scalar variable u:
    min g(u) (1a)
    s.t. p_c(u) ≥ η (1b)
    u ∈ F, (1c)
    where p_c(u) = P(f(w1, w2, u) ≤ 0). (2)
    Manuscript received December 22, 2019; revised July 1, 2020 and December 16, 2020; accepted February 17, 2021.

  • The following are our key assumptions: 1) we assume that the uncertainty is nonparametric, which, in our case, means that the probability distribution functions associated with w1 and w2 are not known.

  • The sample size does not vary with η. This is because the samples of the uncertain parameters are used to obtain an estimate of E[f(w1, w2, u)] and (Var[f(w1, w2, u)])^{1/2}, and importantly, this estimation is independent of η. As can be seen from Table II, our proposed formulation based on Reproducing Kernel Hilbert Space (RKHS) embedding has significantly better sample complexity than all the above-discussed approaches.
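The moment-matching idea in the highlights can be illustrated numerically. The sketch below is a minimal illustration, not the paper's implementation: it uses the kernel trick to compute the squared Maximum Mean Discrepancy (MMD) between two sample sets, i.e., the distance between their empirical embeddings in the RKHS, which is the quantity the reformulated optimization drives down. The Gaussian kernel, the bandwidth `gamma`, and the toy sample sets are all illustrative assumptions.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    """Gaussian (RBF) kernel matrix between 1-D sample vectors a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-gamma * d ** 2)

def mmd_sq(x, y, gamma=1.0):
    """Squared MMD between the empirical RKHS embeddings of x and y,
    via the kernel trick:
    MMD^2 = mean k(x, x') - 2 mean k(x, y) + mean k(y, y')."""
    return (rbf(x, x, gamma).mean()
            - 2.0 * rbf(x, y, gamma).mean()
            + rbf(y, y, gamma).mean())

# Illustrative use: x plays the role of constraint-function samples
# f(w1, w2, u); y plays the role of samples from a "desired"
# distribution with its mass on the feasible side f <= 0.
rng = np.random.default_rng(1)
x = rng.normal(-0.5, 0.2, 500)           # stand-in constraint samples
y = -np.abs(rng.normal(0.0, 0.2, 500))   # desired: mass on f <= 0
dist = mmd_sq(x, y)
```

Only kernel evaluations appear, so the cost is quadratic in the sample count and never requires an explicit feature map, which is the source of the complexity claim in the abstract.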


Summary

INTRODUCTION

Consider the following optimization problem in terms of a scalar variable u: min g(u). We consider noise arising from both perception and ego-motion, and the chance constraints are formulated to ensure that the probability of collision avoidance is above the specified threshold. The motivation for this application stems from the fact that prediction in dynamic environments (e.g., of neighboring vehicles in autonomous driving) will always have some uncertainty associated with it. The manipulator should compute the necessary torque commands for path-tracking while accounting for state estimation uncertainty, ensuring that the probability of exerting a torque that violates the specified bounds stays under some threshold. This requirement can be naturally put in the form of a chance constraint.
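As a concrete illustration of a chance constraint of the form p_c(u) = P(f(w1, w2, u) ≤ 0) ≥ η, the sketch below estimates p_c(u) by Monte Carlo from samples of the uncertain parameters. The toy constraint function `f`, the Gaussian noise models, and all numeric values are illustrative assumptions, not the paper's robotics examples.

```python
import numpy as np

def p_constraint(u, w1, w2, f):
    """Monte Carlo estimate of p_c(u) = P(f(w1, w2, u) <= 0),
    computed from i.i.d. samples of the uncertain parameters."""
    return float(np.mean(f(w1, w2, u) <= 0.0))

# Illustrative setup: a nonlinear constraint function of the
# uncertain parameters w1, w2 and the decision variable u.
rng = np.random.default_rng(0)
w1 = rng.normal(0.0, 0.1, size=10_000)  # perception noise (assumed model)
w2 = rng.normal(0.0, 0.1, size=10_000)  # ego-motion noise (assumed model)
f = lambda w1, w2, u: (u + w1) ** 2 + w2 - 1.0

eta = 0.95   # required probability of constraint satisfaction
u = 0.2      # candidate decision
feasible = p_constraint(u, w1, w2, f) >= eta
```

Note that this sample-based view is what makes the problem hard in general: p_c(u) estimated this way is nonsmooth in u, which motivates the RKHS moment-matching reformulation.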

Computational Challenge
Key Idea and Motivation for RKHS Embedding
Contribution
Moment Matching Problem
Moment Matching in the RKHS
Reduced Set Methods
MAIN RESULTS
Overview
Algebraic Form of the Constraint Function
Desired Distribution
Chance-Constrained Optimization as a Moment Matching Problem
Simplification Based on Kernel Trick
APPLICATIONS
Dynamic Obstacle Avoidance Along a Given Path
Inverse-Dynamics-Based Path-Tracking
RESULTS
Collision Avoidance Results
Path-Tracking Results for a Two-Link Manipulator
Consistency and Sample Complexity
CONCLUSION AND FUTURE WORK
