
On the convergence of gradient-like flows with noisy gradient input (joint work with M. Staudigl)

Date: 12.12.2016, 16:45 - 17:45

This talk examines the asymptotic behavior of gradient-like flows that are subject to random disturbances. Specifically, we focus on a broad class of "black-box" gradient descent schemes for constrained convex programming, and we study the trajectory convergence and ergodicity properties of the dynamics in the presence of noise. In the small-noise limit, we show that the process converges almost surely (a.s.) to the solution set of the underlying problem. If, on the other hand, the noise is persistent, we estimate the dynamics' long-run concentration around interior solutions and establish the a.s. convergence of the method to "robust" solutions. Finally, we examine a suitably "rectified" variant of the dynamics which converges irrespective of the magnitude of the noise or the structure of the underlying convex program.
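The abstract does not state the dynamics explicitly; as a minimal sketch, a common mirror-map formulation of such gradient-like flows with noisy gradient input (the symbols below are illustrative assumptions, not the talk's own notation) reads

    dY(t) = -\nabla f(X(t)) \, dt + \sigma(X(t), t) \, dW(t), \qquad X(t) = Q(Y(t)),

where f is the convex objective, W(t) is a Brownian motion modeling the gradient noise, \sigma controls the noise magnitude, and Q(y) = \arg\max_{x \in \mathcal{X}} \{ \langle y, x \rangle - h(x) \} is the mirror map induced by a strongly convex regularizer h on the feasible set \mathcal{X}. The small-noise limit corresponds to \sigma \to 0, while persistent noise keeps \sigma bounded away from zero.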
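For intuition, here is a small numerical sketch: an Euler-Maruyama discretization of dynamics of this kind, using the entropic mirror map on the probability simplex. The objective, target point p, step size, and noise level are illustrative choices, not the talk's own setup.

# Hypothetical sketch: Euler-Maruyama discretization of noisy mirror
# descent (exponential weights) on the probability simplex.
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.6, 0.3, 0.1])    # interior minimizer of f (an assumption)

def grad(x):
    # Gradient of the convex objective f(x) = 0.5 * ||x - p||^2.
    return x - p

def mirror_map(y):
    # Entropic (softmax) mirror map, computed stably.
    z = np.exp(y - y.max())
    return z / z.sum()

T, dt, sigma = 5000, 1e-2, 0.1   # steps, step size, noise magnitude
y = np.zeros(3)                  # dual "score" variable Y(t)
for _ in range(T):
    x = mirror_map(y)
    dW = rng.normal(scale=np.sqrt(dt), size=3)   # Brownian increment
    y += -grad(x) * dt + sigma * dW              # noisy gradient input

print("final iterate:", mirror_map(y))
# With persistent noise (sigma > 0) the trajectory hovers around the
# interior point p; sending sigma -> 0 recovers a.s. convergence.

With sigma = 0 the iterate converges to p; with the noise left on, the trajectory concentrates around p in the long run, mirroring the kind of concentration result the abstract describes.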

Speaker: Panayotis Mertikopoulos

Location: Lecture Hall 12