Primal-dual gradient dynamics that find saddle points of a Lagrangian have
been widely employed for handling constrained optimization problems. Building
on existing methods, we extend the augmented primal-dual gradient dynamics to
incorporate general convex and nonlinear inequality constraints, and we
establish its semi-global exponential stability when the objective function has
quadratic gradient growth. Numerical simulations further suggest that the
exponential convergence rate may depend on the initial distance to the KKT
point.
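To make the setting concrete, the following is a minimal sketch of augmented primal-dual gradient dynamics on a toy inequality-constrained problem, discretized with forward Euler. The problem instance, the specific augmented-Lagrangian form (projected multiplier estimate `mu = max(lam + rho*g(x), 0)`), and all parameter choices are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Toy problem: minimize f(x) = (x1-2)^2 + (x2-1)^2
#   subject to g(x) = x1 + x2 - 2 <= 0.
# The KKT point is x* = (1.5, 0.5) with multiplier lam* = 1
# (the projection of the unconstrained minimizer (2, 1) onto
# the half-plane x1 + x2 <= 2).

def grad_f(x):
    return 2.0 * (x - np.array([2.0, 1.0]))

def g(x):
    return x[0] + x[1] - 2.0

grad_g = np.array([1.0, 1.0])  # gradient of g (constant here)

rho = 1.0   # augmentation parameter (illustrative choice)
dt = 0.01   # Euler step size
x = np.array([0.0, 0.0])
lam = 0.0

for _ in range(5000):
    # Augmented multiplier estimate, projected onto the nonnegatives.
    mu = max(lam + rho * g(x), 0.0)
    x = x - dt * (grad_f(x) + grad_g * mu)   # primal gradient descent
    lam = lam + dt * (mu - lam)              # dual update

print(np.round(x, 3), round(lam, 3))
```

At equilibrium the update forces `mu = lam` and `grad_f(x) + lam * grad_g = 0`, which recovers the KKT conditions; with this strongly convex objective the trajectory settles at the KKT point, consistent with the exponential stability the abstract claims.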