Stochastic Proximal Gradient Methods for Non-smooth Non-Convex Regularized Problems
Xu, Yi ; Jin, Rong ; Yang, Tianbao
arXiv (2019)
In this paper, we propose and analyze stochastic proximal gradient methods for minimizing a non-convex objective that consists of a smooth non-convex loss and a non-smooth non-convex regularizer. Surprisingly, these methods are as simple as those proposed for handling convex regularizers, and enjoy the same complexities as those for solving convex regularized non-convex problems in terms of finding an approximate stationary point. Our results improve upon the state-of-the-art results for solving non-smooth non-convex regularized problems in (Xu et al., 2018a; Metel and Takeda, 2019). In addition, we extend our results to stochastic proximal gradient methods with momentum, such as heavy-ball and Nesterov's accelerated gradient.
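The update analyzed in the paper has the standard proximal-gradient form: a stochastic gradient step on the smooth loss followed by the proximal map of the regularizer. Below is a minimal illustrative sketch (not the paper's exact algorithm or step-size schedule), using a least-squares loss and the non-convex ℓ₀ regularizer, whose proximal map is closed-form hard-thresholding; the `beta` parameter adds an optional heavy-ball momentum term. All function names and constants here are illustrative assumptions.

```python
import numpy as np

def prox_l0(x, lam, eta):
    # Proximal map of the non-convex regularizer lam * ||x||_0 with step eta:
    # hard-thresholding, which zeroes entries with |x_i| <= sqrt(2 * eta * lam).
    out = x.copy()
    out[np.abs(x) <= np.sqrt(2.0 * eta * lam)] = 0.0
    return out

def stochastic_prox_grad(A, b, lam=0.1, eta=0.01, beta=0.0,
                         batch=16, iters=500, seed=0):
    # Sketch of a stochastic proximal gradient loop for
    #   min_x  (1/2n) ||A x - b||^2 + lam * ||x||_0.
    # beta > 0 gives a heavy-ball-style momentum variant (illustrative form).
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    x_prev = x.copy()
    for _ in range(iters):
        idx = rng.integers(0, n, size=batch)            # sample a minibatch
        g = A[idx].T @ (A[idx] @ x - b[idx]) / batch    # stochastic gradient of the loss
        step = x - eta * g + beta * (x - x_prev)        # gradient step (+ momentum)
        x_prev, x = x, prox_l0(step, lam, eta)          # proximal step on the regularizer
    return x

# Tiny synthetic sparse-recovery instance (illustrative only).
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 50))
x_true = np.zeros(50)
x_true[:5] = 3.0
b = A @ x_true + 0.01 * rng.standard_normal(200)
x_hat = stochastic_prox_grad(A, b)
```

With `beta=0.0` this reduces to the plain stochastic proximal gradient update; the point of the paper's analysis is that this simple scheme, unchanged from the convex-regularizer case, already achieves the stated complexity for non-convex regularizers.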
Published: 2019-02-20
Classification:  Mathematics - Optimization and Control
@article{1902.07672,
     author = {Xu, Yi and Jin, Rong and Yang, Tianbao},
     title = {Stochastic Proximal Gradient Methods for Non-smooth Non-Convex
  Regularized Problems},
     journal = {arXiv},
     volume = {2019},
     number = {0},
     year = {2019},
     language = {en},
     url = {http://dml.mathdoc.fr/item/1902.07672}
}
Xu, Yi; Jin, Rong; Yang, Tianbao. Stochastic Proximal Gradient Methods for Non-smooth Non-Convex Regularized Problems. arXiv (2019). http://gdmltest.u-ga.fr/item/1902.07672/