In this paper, we propose and analyze stochastic proximal gradient methods
for minimizing a non-convex objective that consists of a smooth non-convex loss
and a non-smooth non-convex regularizer. Surprisingly, these methods are as
simple as those proposed for handling convex regularizers, and enjoy the same
complexities as those for solving convex regularized non-convex problems in
terms of finding an approximate stationary point. Our results improve upon
the state-of-the-art results for solving non-smooth non-convex regularized problems
in (Xu et al., 2018a; Metel and Takeda, 2019). In addition, we extend our
results to stochastic proximal gradient methods with momentum, such as the
heavy-ball method and Nesterov's accelerated gradient.
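To make the setting concrete, the following is a minimal NumPy sketch of one such method: a stochastic proximal gradient step with an optional heavy-ball momentum term, using the l0 penalty as an example of a non-smooth non-convex regularizer (its proximal map is hard thresholding). The step size, penalty weight, momentum parameter, and least-squares demo are illustrative assumptions, not parameters or results from the paper.

```python
import numpy as np

def prox_l0(v, eta, lam):
    # Proximal map of the (non-convex) l0 penalty lam * ||x||_0:
    # hard thresholding, keeping entries with |v_i| > sqrt(2 * eta * lam).
    out = v.copy()
    out[np.abs(v) <= np.sqrt(2.0 * eta * lam)] = 0.0
    return out

def stochastic_prox_grad(grad_fn, x0, eta=0.01, lam=0.1, beta=0.0,
                         n_iters=2000, seed=0):
    # Iterates x_{t+1} = prox_{eta*r}(x_t - eta*g_t + beta*(x_t - x_{t-1})),
    # where g_t is a stochastic gradient of the smooth loss; beta = 0
    # recovers the plain (momentum-free) stochastic proximal gradient step.
    rng = np.random.default_rng(seed)
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(n_iters):
        g = grad_fn(x, rng)                    # minibatch gradient of the loss
        v = x - eta * g + beta * (x - x_prev)  # gradient + heavy-ball step
        x_prev, x = x, prox_l0(v, eta, lam)    # proximal (hard-threshold) step
    return x

# Hypothetical demo: sparse least squares, f(x) = ||Ax - b||^2 / (2n).
rng = np.random.default_rng(1)
n, d = 1000, 50
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:5] = 1.0
b = A @ x_true + 0.1 * rng.standard_normal(n)

def grad_fn(x, rng, batch=32):
    idx = rng.integers(0, n, size=batch)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / batch

x_hat = stochastic_prox_grad(grad_fn, np.zeros(d), beta=0.5)
print("recovered support:", np.nonzero(x_hat)[0])
```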