Abstract

A recent nonsmooth vector forward mode of algorithmic differentiation (AD) computes Nesterov's L-derivatives for nonsmooth composite functions; these L-derivatives provide useful sensitivity information to methods for nonsmooth optimization and equation solving. The established reverse AD mode evaluates gradients efficiently for smooth functions, but does not extend directly to nonsmooth functions. This article therefore examines branch-locking strategies that harness the benefits of smooth AD techniques even in the nonsmooth case, improving the computational performance of the nonsmooth vector forward AD mode. In these strategies, each nonsmooth elemental function in the original composition is ‘locked’ into an appropriate linear ‘branch’, replacing the original composition with a smooth variant to which efficient smooth AD techniques such as the reverse AD mode may then be applied. To choose the correct linear branches, inexpensive probing steps are used to ascertain the composite function's local behaviour. A simple implementation is described, and the developed techniques are extended to nonsmooth local implicit functions and inverse functions.
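To make the branch-locking idea concrete, the following is a minimal Python sketch, not taken from the paper: a composition containing abs() is probed along a direction to identify the active linear branch, the nonsmooth elemental is locked into that branch, and the resulting smooth variant is differentiated. All names here (branch_locked_abs, f_smooth, the probing step eps) are illustrative assumptions, and a central finite difference stands in for the reverse AD mode that would be applied in practice.

```python
# Hypothetical sketch of branch-locking for a composition containing abs().
# The probing direction d reveals which linear branch of abs() is active
# near the reference point x0; the elemental is then 'locked' into it.

def branch_locked_abs(probe_value):
    """Return a smooth (linear) surrogate for abs(), locked into the
    branch indicated by the probed argument value."""
    if probe_value > 0:
        return lambda x: x       # right branch: abs(x) = x
    elif probe_value < 0:
        return lambda x: -x      # left branch: abs(x) = -x
    else:
        # Probe landed exactly on the kink; a real implementation would
        # fall back to the nonsmooth vector forward mode here.
        return lambda x: x

# Original nonsmooth composition: f(x) = abs(x - 1) + x**2
def f(x, abs_impl=abs):
    return abs_impl(x - 1) + x**2

x0, d = 0.5, 1.0                 # reference point and probing direction
eps = 1e-6                       # inexpensive probing step
abs_locked = branch_locked_abs((x0 + eps * d) - 1)

# Smooth variant: the nonsmooth elemental is replaced by its locked
# linear branch, so standard smooth (reverse-mode) AD would now apply.
f_smooth = lambda x: f(x, abs_impl=abs_locked)

h = 1e-7                         # finite difference stands in for AD here
grad = (f_smooth(x0 + h) - f_smooth(x0 - h)) / (2 * h)
print(grad)                      # ≈ -1 + 2*x0 = 0.0 at x0 = 0.5
```

Since the probe at x0 = 0.5 finds x - 1 < 0, abs() is locked into its left branch, and the locked composition -(x - 1) + x**2 is smooth everywhere, which is what makes the efficient reverse AD mode applicable.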
