Abstract
We study the problem of maximizing a continuous DR-submodular function that is not necessarily smooth. We prove that the continuous greedy algorithm achieves a (1 - 1/e)OPT - ε guarantee when the function is monotone and Hölder-smooth, meaning that it admits a Hölder-continuous gradient. For functions that are non-differentiable or non-smooth, we propose a variant of the mirror-prox algorithm that attains a (1/2)OPT - ε guarantee. We apply our algorithmic frameworks to robust submodular maximization and to distributionally robust submodular maximization under Wasserstein ambiguity. In particular, applying the mirror-prox method to robust submodular maximization yields a single feasible solution whose value is at least (1/2)OPT - ε. For distributionally robust maximization under Wasserstein ambiguity, we derive a submodular-convex maximin reformulation whose objective function is Hölder-smooth, to which both the continuous greedy method and the mirror-prox method apply. This is joint work with Duksang Lee, a fifth-year Ph.D. student at KAIST Math, and Nam Ho-Nguyen from the University of Sydney.
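To make the first result concrete, below is a minimal sketch of the classical continuous greedy (Frank-Wolfe-style) iteration that the abstract refers to; the talk's analysis for Hölder-smooth objectives may use a refined variant. The names grad_F, lmo, and the step count T are illustrative placeholders, not taken from the talk, and the sketch assumes a convex, down-closed feasible region P containing the origin.

```python
import numpy as np

def continuous_greedy(grad_F, lmo, dim, T=100):
    """Classical continuous greedy: T Frank-Wolfe-style ascent steps.

    grad_F(x) -- gradient oracle for the DR-submodular objective F.
    lmo(g)    -- linear maximization oracle: argmax over v in P of <g, v>.
    Assumes P is convex, down-closed, and contains 0, so the output is a
    convex combination of points of P and hence feasible.
    """
    x = np.zeros(dim)
    for _ in range(T):
        v = lmo(grad_F(x))  # best feasible ascent direction at x
        x = x + v / T       # advance a 1/T fraction toward v
    return x

# Hypothetical LMO for the cardinality polytope {x in [0,1]^n : sum(x) <= k};
# valid for the monotone case, where the gradient is entrywise nonnegative.
def lmo_cardinality(g, k=5):
    v = np.zeros_like(g)
    v[np.argsort(-g)[:k]] = 1.0  # put mass on the k largest gradient entries
    return v
```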