Abstract

We apply a derivative-free optimization method based on novel low-rank tensor methods to the problem of propagating fuzzy uncertainty through a continuous real-valued function. Adhering to Zadeh's extension principle, this problem can be reformulated as a sequence of optimization problems over nested search spaces. The optimization method we use builds a low-rank tensor approximation of the function sampled on a grid and then searches for the minimal and maximal entries of this low-rank tensor. In contrast to classical fuzzy uncertainty propagation methods, such as the vertex method and the transformation method, the method we propose does not exhibit an inherent exponential scaling with increasing dimension of the search space. Of course, no derivative-free optimization algorithm can achieve sub-exponential scaling in the dimension for all continuous functions. The algorithm that we present here, however, can exploit a specific type of structure and regularity (beyond continuity) that is often present in real-world optimization problems. We illustrate this with high-dimensional numerical examples in which the presented method clearly outperforms several established derivative-free optimization codes.
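To make the extension-principle reformulation concrete, the following is a minimal, hypothetical Python sketch (not the paper's algorithm): each alpha-cut of the inputs defines a box, and the corresponding alpha-cut of the output is the min/max of the function over that box. The helper names and the brute-force inner solver, which samples the full tensor of function values, are illustrative assumptions; the paper's contribution is precisely to replace this exponentially large full tensor with a low-rank approximation whose extremal entries can be located cheaply.

```python
# Illustrative sketch of Zadeh's extension principle via alpha-cuts.
# The inner min/max here is a naive full-grid search, which scales
# exponentially in the number of inputs; the paper replaces it with a
# low-rank tensor approximation and a search for its extremal entries.
import numpy as np

def alpha_cut_triangular(a, m, b, alpha):
    """Alpha-cut [lo, hi] of a triangular fuzzy number (a, m, b)."""
    return a + alpha * (m - a), b - alpha * (b - m)

def propagate(f, fuzzy_inputs, alphas, pts_per_dim=21):
    """Output alpha-cuts of f for triangular fuzzy inputs.

    For each alpha, the search space is the box spanned by the input
    alpha-cuts; these boxes are nested as alpha increases.
    """
    cuts = []
    for alpha in alphas:
        axes = [np.linspace(*alpha_cut_triangular(a, m, b, alpha), pts_per_dim)
                for (a, m, b) in fuzzy_inputs]
        grid = np.meshgrid(*axes, indexing="ij")
        vals = f(*grid)                        # full tensor of samples
        cuts.append((vals.min(), vals.max()))  # output interval at this alpha
    return cuts

# Example: f(x, y) = x * y with two triangular fuzzy inputs.
f = lambda x, y: x * y
print(propagate(f, [(1.0, 2.0, 3.0), (0.5, 1.0, 1.5)], alphas=[0.0, 0.5, 1.0]))
```

Note how the output intervals shrink as alpha grows, reflecting the nested structure of the optimization problems mentioned in the abstract.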
