
Published July 12, 2019 | Version v1
Preprint | Open Access

Distributed Zeroth Order Optimization Over Random Networks: A Kiefer-Wolfowitz Stochastic Approximation Approach

  • 1. University of Novi Sad Faculty of Sciences
  • 2. University of Novi Sad Faculty of Technical Sciences
  • 3. Carnegie Mellon University

Description

We study a standard distributed optimization framework in which N networked nodes collaboratively minimize the sum of their local convex costs. The main body of existing work considers this problem when the underlying network is either static or deterministically varying, and the distributed optimization algorithm is of first or second order, i.e., it uses the local costs' gradients and possibly the local Hessians. In this paper, we consider the currently understudied but highly relevant scenario in which: 1) only noisy estimates of function values are available (neither gradients nor Hessians can be evaluated); and 2) the underlying network varies randomly (according to an independent, identically distributed process). For this random-networks, zeroth-order optimization setting, we develop a distributed stochastic approximation method of the Kiefer-Wolfowitz type. Furthermore, under standard smoothness and strong convexity assumptions on the local costs, we establish an O(1/sqrt(k)) mean square convergence rate for the method, matching the rate of the method's centralized counterpart under equivalent conditions.
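To illustrate the general mechanism described in the abstract (consensus mixing over an i.i.d. random network combined with Kiefer-Wolfowitz finite-difference updates from noisy function values), the following Python sketch runs a toy instance. The quadratic local costs, the Metropolis weight matrix, and the step-size and smoothing schedules are illustrative assumptions for the sketch only, not the paper's exact construction or parameter choices.

```python
import numpy as np

# Minimal sketch (assumptions, not the paper's exact algorithm):
# N nodes, each with a noisy zeroth-order oracle for its local convex cost f_i,
# communicating over an i.i.d. random network. Each iteration mixes neighbors'
# iterates with a random weight matrix W(k) and takes a Kiefer-Wolfowitz
# finite-difference step using only noisy function values.

rng = np.random.default_rng(0)
N, d = 5, 3                                 # number of nodes, problem dimension
A = rng.normal(size=(N, d, d))              # data defining illustrative quadratic costs
targets = rng.normal(size=(N, d))

def noisy_f(i, x):
    """Noisy function-value oracle for node i's local convex cost (illustrative)."""
    r = A[i] @ x - targets[i]
    return 0.5 * r @ r + 0.01 * rng.normal()

def random_weight_matrix(p=0.5):
    """Sample an i.i.d. random graph and return Metropolis weights (one possible choice)."""
    adj = np.triu(rng.random((N, N)) < p, 1)
    adj = adj | adj.T
    deg = adj.sum(axis=1)
    W = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            if adj[i, j]:
                W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
    np.fill_diagonal(W, 1.0 - W.sum(axis=1))
    return W

x = np.zeros((N, d))                        # local iterates, one row per node
for k in range(1, 2001):
    alpha = 1.0 / k                         # step size (illustrative schedule)
    c = 1.0 / k ** 0.25                     # finite-difference smoothing parameter
    W = random_weight_matrix()              # fresh i.i.d. network realization
    mixed = W @ x                           # consensus / mixing step
    for i in range(N):
        # Two-sided finite-difference gradient estimate from noisy function values
        g = np.zeros(d)
        for m in range(d):
            e = np.zeros(d)
            e[m] = 1.0
            g[m] = (noisy_f(i, mixed[i] + c * e)
                    - noisy_f(i, mixed[i] - c * e)) / (2 * c)
        x[i] = mixed[i] - alpha * g

print("node disagreement:", np.linalg.norm(x - x.mean(axis=0)))
```

In this sketch the nodes' iterates drift toward consensus while each node descends a noisy finite-difference estimate of its own cost; the paper's analysis concerns the mean square error of such a scheme under its stated assumptions.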

Notes

Preprint of a paper accepted at IEEE CDC 2018 (https://cdc2018.ieeecss.org/).

Files

KWalgorithm.pdf (485.8 kB)
md5:947ca15c7737a1d0a60643f7d563ded6

Additional details

Related works

Is supplemented by
10.5281/zenodo.3333691 (DOI)

Funding

I-BiDaaS – Industrial-Driven Big Data as a Self-Service Solution (Grant No. 780787)
European Commission