Robust Asynchronous Stochastic Gradient-Push: Asymptotically Optimal and Network-Independent Performance for Strongly Convex Functions
Olshevsky, Alex ; Paschalidis, Ioannis Ch. ; Spiridonoff, Artin
arXiv:1811.03982 / Harvested from arXiv
We study the standard model of distributed optimization of a sum of functions $F(z) = \sum_{i=1}^n f_i(z)$, where node $i$ in a network holds the function $f_i(z)$. We consider a harsh network model characterized by asynchronous updates, message delays, unpredictable message losses, and directed communication among nodes. In this setting, we analyze a modification of the Gradient-Push method for distributed optimization, assuming that (i) node $i$ is capable of generating gradients of its function $f_i(z)$ corrupted by zero-mean bounded-support additive noise at each step, (ii) $F(z)$ is strongly convex, and (iii) each $f_i(z)$ has Lipschitz gradients. We show that our proposed method asymptotically performs as well as centralized gradient descent that takes steps in the direction of the sum of the noisy gradients of all the functions $f_1(z), \ldots, f_n(z)$ at each step.
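The abstract's setting can be illustrated with a minimal, synchronous Python sketch of a push-sum-based gradient method over a directed graph. This is not the paper's robust asynchronous algorithm (no asynchrony, delays, or message losses are modeled); the quadratic local objectives, uniform bounded-support noise, 3-node directed ring, and step-size schedule below are illustrative assumptions only.

```python
# Minimal synchronous sketch of stochastic Gradient-Push (push-sum + gradient steps).
# Assumptions (not from the paper): quadratic local objectives, uniform noise,
# a directed 3-cycle, and a diminishing step size alpha_t = 1/(t + 10).
import numpy as np

rng = np.random.default_rng(0)
n, d, T = 3, 2, 20000          # nodes, dimension, iterations

# Hypothetical strongly convex local objectives f_i(z) = 0.5 * ||A_i z - b_i||^2.
A = [rng.standard_normal((d + 1, d)) for _ in range(n)]
b = [rng.standard_normal(d + 1) for _ in range(n)]
# Minimizer of F(z) = sum_i f_i(z): solve (sum_i A_i^T A_i) z = sum_i A_i^T b_i.
z_star = np.linalg.solve(sum(Ai.T @ Ai for Ai in A),
                         sum(Ai.T @ bi for Ai, bi in zip(A, b)))

def noisy_grad(i, z):
    """Gradient of f_i at z plus zero-mean, bounded-support (uniform) noise."""
    return A[i].T @ (A[i] @ z - b[i]) + rng.uniform(-0.1, 0.1, size=d)

# Column-stochastic mixing matrix for a directed 3-cycle: each node splits its
# mass equally between itself and its single out-neighbor (push-sum weights).
C = np.zeros((n, n))
for i in range(n):
    C[i, i] = 0.5
    C[(i + 1) % n, i] = 0.5

x = np.zeros((n, d))           # push-sum numerators
y = np.ones(n)                 # push-sum weights
z = x.copy()                   # de-biased estimates x_i / y_i

for t in range(1, T + 1):
    alpha = 1.0 / (t + 10)     # diminishing step size
    g = np.stack([noisy_grad(i, z[i]) for i in range(n)])
    x = C @ (x - alpha * g)    # take local noisy gradient steps, then push numerators
    y = C @ y                  # push weights
    z = x / y[:, None]         # the ratio corrects the bias of directed averaging

print("distance of node 0 to the minimizer of F:",
      np.linalg.norm(z[0] - z_star))
```

The ratio $x_i / y_i$ is what lets push-sum work with a merely column-stochastic (directed) mixing matrix: the numerators and weights are skewed by the graph in the same way, so dividing them recovers an estimate of the network average despite one-way communication.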
Published: 2018-11-09
Classification: Mathematics - Optimization and Control
@article{1811.03982,
     author = {Olshevsky, Alex and Paschalidis, Ioannis Ch. and Spiridonoff, Artin},
     title = {Robust Asynchronous Stochastic Gradient-Push: Asymptotically Optimal and Network-Independent Performance for Strongly Convex Functions},
     journal = {arXiv},
     volume = {2018},
     number = {0},
     year = {2018},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1811.03982}
}
Olshevsky, Alex; Paschalidis, Ioannis Ch.; Spiridonoff, Artin. Robust Asynchronous Stochastic Gradient-Push: Asymptotically Optimal and Network-Independent Performance for Strongly Convex Functions. arXiv, Vol. 2018 (2018), no. 0. http://gdmltest.u-ga.fr/item/1811.03982/