Abstract

We consider a stochastic continuous submodular huge-scale optimization problem, which arises naturally in many applications such as machine learning. With high-dimensional data, computing the full gradient vector can become prohibitively expensive. To reduce the computational complexity and memory requirements, we propose a stochastic block-coordinate gradient projection algorithm for maximizing continuous submodular functions, which selects a random block of coordinates of the gradient vector and updates the estimates along the positive gradient direction. We prove that the estimates of all nodes generated by the algorithm converge to stationary points with probability 1. Moreover, we show that with appropriately chosen step sizes the proposed algorithm achieves the tight $\left(\frac{p_{\min}}{2}F^* - \epsilon\right)$ approximation guarantee after $O(1/\epsilon^2)$ iterations for DR-submodular functions. Furthermore, with diminishing step sizes, the algorithm achieves the tight $\left(\frac{\gamma^2}{1+\gamma^2}\,p_{\min}F^* - \epsilon\right)$ approximation guarantee after $O(1/\epsilon^2)$ iterations for weakly DR-submodular functions with parameter $\gamma$.
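
To make the update rule concrete, the following Python snippet is a minimal sketch of the kind of iteration the abstract describes. The oracle names stoch_grad_block (unbiased noisy partial derivatives on a block) and project (Euclidean projection onto the feasible convex set), the uniform block-sampling scheme, and the step-size schedule are illustrative assumptions, not the paper's exact specification.

    import numpy as np

    def block_coordinate_gradient_projection(stoch_grad_block, project, dim,
                                             num_blocks, num_iters=1000, rng=None):
        """Sketch of stochastic block-coordinate gradient projection for
        maximizing a (weakly) DR-submodular function over a convex set.
        stoch_grad_block and project are assumed, user-supplied oracles."""
        rng = rng or np.random.default_rng(0)
        x = np.zeros(dim)                                  # assumed feasible starting point
        blocks = np.array_split(np.arange(dim), num_blocks)
        for t in range(num_iters):
            b = blocks[rng.integers(num_blocks)]           # uniform block: p_min = 1/num_blocks
            step = 1.0 / np.sqrt(t + 1)                    # diminishing step size
            g_b = stoch_grad_block(x, b)                   # noisy partial derivatives on block b only
            x[b] = x[b] + step * g_b                       # ascend along the chosen block
            x = project(x)                                 # stay in the feasible set
        return x

Because each block is sampled with probability 1/num_blocks in this sketch, the quantity $p_{\min}$ in the guarantees above equals 1/num_blocks; evaluating only the sampled partial derivatives, rather than the whole gradient, is what reduces the per-iteration cost in high dimensions.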

Highlights

  • In this paper, we focus on submodular function maximization, which has recently attracted significant attention in academia since submodularity is a crucial concept in combinatorial optimization

  • We have considered a stochastic optimization problem of continuous submodular functions, which is an important problem in many areas such as machine learning and social science

  • We propose the stochastic block-coordinate gradient projection algorithm for maximizing submodular functions, which randomly selects a block of coordinates of the approximate gradient vector

Summary

Introduction

We focus on submodular function maximization, which has recently attracted significant attention in academia since submodularity is a crucial concept in combinatorial optimization. Many polynomial-time algorithms exist for approximately maximizing submodular functions with approximation guarantees, such as local search and greedy algorithms [22,23,24,25]. Despite this progress, these methods rely on combinatorial techniques, which have some limitations [26]. Asynchronous coordinate descent methods have been proposed in recent years [47,48]. However, stochastic block-coordinate gradient projection methods for maximizing submodular functions have barely been investigated. To fill this gap, we propose the stochastic block-coordinate gradient projection algorithm to solve stochastic continuous submodular optimization problems, which are introduced in [30].

Mathematical Background
Problem Formulation and Algorithm Design
Main Results
Performance Analysis
Conclusion