A modification of the limited-memory variable metric BNS method for large-scale unconstrained optimization of differentiable functions is considered. It corrects the stored difference vectors, based on the idea of conjugate directions, so that the previous quasi-Newton conditions are satisfied more accurately. In comparison with [11], more previous iterations can be utilized here. For quadratic objective functions the improvement of convergence is in some sense optimal: all stored corrected difference vectors are mutually conjugate, and the quasi-Newton conditions with these vectors are satisfied exactly. The algorithm is globally convergent for convex, sufficiently smooth functions, and our numerical experiments indicate its efficiency.
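The conjugate-directions idea behind the correction can be illustrated on a quadratic model. This is only a minimal sketch, not the authors' algorithm: it assumes a quadratic objective f(x) = ½ xᵀAx, for which yₖ = Asₖ, so A-conjugacy sᵢᵀAsⱼ = sᵢᵀyⱼ can be enforced by a Gram-Schmidt-like correction using only the stored (s, y) pairs, without forming A explicitly; the quasi-Newton relation y = As is preserved for the corrected vectors.

```python
import numpy as np

# Hypothetical illustration (not the BNS implementation from the paper):
# make stored difference vectors mutually A-conjugate on a quadratic model,
# using only inner products of stored (s, y) pairs.
rng = np.random.default_rng(0)
n, m = 6, 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # SPD Hessian of the model quadratic

S = rng.standard_normal((n, m))      # raw difference vectors s_k
Y = A @ S                            # quadratic case: y_k = g_{k+1} - g_k = A s_k

S_c = np.empty_like(S)               # corrected difference vectors
Y_c = np.empty_like(Y)
for k in range(m):
    s, y = S[:, k].copy(), Y[:, k].copy()
    for j in range(k):
        # subtract the A-conjugate component along an earlier corrected direction;
        # note Y_c[:, j] @ s = S_c[:, j]^T A s, computed without A itself
        coef = (Y_c[:, j] @ s) / (Y_c[:, j] @ S_c[:, j])
        s -= coef * S_c[:, j]
        y -= coef * Y_c[:, j]        # keeps y = A s, so the QN condition survives
    S_c[:, k], Y_c[:, k] = s, y

# Pairwise conjugacy check: s_i^T A s_j = s_i^T y_j should vanish for i != j.
G = S_c.T @ Y_c
off_diag = G - np.diag(np.diag(G))
print(np.max(np.abs(off_diag)))      # near machine precision
```

On a genuinely quadratic function the corrected vectors are exactly conjugate in exact arithmetic; for general smooth functions the paper applies the correction only approximately, to improve satisfaction of the previous quasi-Newton conditions.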
@inproceedings{702689,
  author    = {Vlček, Jan and Lukšan, Ladislav},
  title     = {A modified limited-memory BNS method for unconstrained minimization derived from the conjugate directions idea},
  booktitle = {Programs and Algorithms of Numerical Mathematics},
  series    = {GDML\_Books},
  publisher = {Institute of Mathematics AS CR},
  address   = {Prague},
  year      = {2015},
  pages     = {237--243},
  url       = {http://dml.mathdoc.fr/item/702689}
}
Vlček, Jan; Lukšan, Ladislav. A modified limited-memory BNS method for unconstrained minimization derived from the conjugate directions idea, in Programs and Algorithms of Numerical Mathematics, GDML_Books, (2015), pp. 237-243. http://gdmltest.u-ga.fr/item/702689/