We discuss abstract Newtonian frameworks for generalized equations, and how a number of important algorithms for constrained optimization can be related to them by introducing structured perturbations into the basic Newton iteration. This gives useful tools for local convergence and rate-of-convergence analysis of various algorithms from unified perspectives, often yielding additional insight and sharper results than those provided by other approaches. Specific constrained optimization algorithms that can be conveniently analyzed within perturbed Newtonian frameworks include the sequential quadratic programming method and its various modifications (truncated, augmented Lagrangian, composite step, stabilized, and equipped with second-order corrections), the linearly constrained Lagrangian methods, inexact restoration, sequential quadratically constrained quadratic programming, and certain interior feasible directions methods. We recall some of those algorithms as examples, to illustrate the underlying viewpoint. We also discuss how the main ideas of this approach go beyond clearly Newton-related methods and are applicable, for example, to the augmented Lagrangian algorithm (also known as the method of multipliers), which in principle does not appear to be of Newton type, since its iterations do not approximate any part of the problem data.
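To illustrate the underlying viewpoint, here is a minimal sketch of a Newton iteration for a smooth equation with an optional structured perturbation added to the step. All names (`perturbed_newton`, the test system `F`, the perturbation `pert`) are hypothetical and not taken from the paper; the example only illustrates the general idea that a perturbation which vanishes sufficiently fast near the solution does not destroy local convergence.

```python
import numpy as np

def perturbed_newton(F, J, x0, perturb=None, tol=1e-12, max_iter=50):
    """Basic Newton iteration for F(x) = 0, with an optional structured
    perturbation added to the Newton step (illustrative sketch only)."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        # Exact Newton step: solve J(x) s = -F(x)
        step = np.linalg.solve(J(x), -Fx)
        if perturb is not None:
            # Structured perturbation, e.g. modeling inexactness in the
            # subproblem solve of a truncated/stabilized variant
            step = step + perturb(x, step, k)
        x = x + step
    return x

# Hypothetical test system: F(x) = (x1^2 + x2^2 - 2, x1 - x2), root (1, 1)
F = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])

# A perturbation shrinking like ||F(x)|| vanishes near the solution,
# so local convergence is preserved
pert = lambda x, s, k: 0.1 * np.linalg.norm(F(x)) * np.ones(2)

x_exact = perturbed_newton(F, J, [2.0, 0.5])
x_pert = perturbed_newton(F, J, [2.0, 0.5], perturb=pert)
```

Both runs converge to the root (1, 1); the point of the perturbed-Newton viewpoint is that many practical algorithms can be written in exactly this form, with the perturbation term encoding the algorithm-specific modification.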
Homepage of Mikhail Solodov