Stochastic incremental mirror descent algorithms with Nesterov smoothing

Speaker: Professor Sorin-Mihai Grad

Department of Applied Mathematics, ENSTA Paris, France

Date: October 20, 2021, at 10:00 am (Chilean time)

Title: Stochastic incremental mirror descent algorithms with Nesterov smoothing

Abstract: We propose a stochastic incremental mirror descent method constructed by means of Nesterov smoothing for minimizing a sum of finitely many proper, convex, and lower semicontinuous functions over a nonempty closed convex set in a Euclidean space. The algorithm can be adapted to minimize, in the same setting, a sum of finitely many proper, convex, and lower semicontinuous functions composed with linear operators. Another modification of the scheme leads to a stochastic incremental mirror descent Bregman-proximal scheme with Nesterov smoothing for minimizing, in the same framework, the sum of finitely many proper, convex, and lower semicontinuous functions together with a prox-friendly proper, convex, and lower semicontinuous function. Unlike previous contributions in the literature on mirror descent methods for minimizing sums of functions, we do not require the summands to be (Lipschitz) continuous or differentiable. Applications in logistics, tomography, and machine learning, modelled as optimization problems, illustrate the theoretical achievements. The talk is based on joint work with Sandy Bitterlich.
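To give a flavour of the ingredients named in the abstract, the following is a minimal Python sketch, not the speaker's algorithm, combining the two building blocks on a toy instance: Nesterov smoothing of a nonsmooth summand (here the absolute value, whose smoothing with the quadratic prox-function is the Huber function) and an incremental mirror descent step (here the classical entropic mirror map over the probability simplex). The problem instance, step-size rule, and smoothing parameter are all illustrative assumptions; the actual schemes in the paper handle general proper, convex, lower semicontinuous summands, compositions with linear operators, and a Bregman-proximal variant.

```python
import numpy as np

def huber_grad(t, mu):
    # Gradient of the Nesterov smoothing of |t| with prox-function u^2/2:
    # the smoothed function is the Huber function, with gradient clip(t/mu, -1, 1).
    return np.clip(t / mu, -1.0, 1.0)

def stochastic_incremental_md(A, b, mu=0.1, steps=500, seed=0):
    """Toy sketch (not the authors' method): minimize
    (1/m) * sum_i |a_i^T x - b_i| over the probability simplex,
    sampling one summand per step and taking an entropic mirror
    descent step on its Nesterov-smoothed surrogate."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.full(n, 1.0 / n)                      # start at the simplex center
    for k in range(1, steps + 1):
        i = rng.integers(m)                      # draw one summand at random
        g = huber_grad(A[i] @ x - b[i], mu) * A[i]   # gradient of smoothed summand
        step = 1.0 / np.sqrt(k)                  # diminishing step size (assumed)
        x = x * np.exp(-step * g)                # entropic mirror step ...
        x /= x.sum()                             # ... i.e. multiplicative update + renormalize
    return x
```

The entropic mirror map is chosen only because it makes the mirror step explicit (a multiplicative update followed by normalization); any Bregman distance suited to the constraint set could play its role.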

Venue: Online via Google Meet: https://meet.google.com/bhn-ufcj-wnw

A brief biography of the speaker: Sorin-Mihai Grad is a Professor of Optimization at ENSTA Paris; he previously worked at Chemnitz University of Technology (where he earned his PhD), Leipzig University, and the University of Vienna. His research interests include convex, vector, and numerical optimization, as well as convex analysis and applications that can be modelled as such.

Coordinators: Fabián Flores-Bazán (CMM, Universidad de Concepción) and Abderrahim Hantoute (Alicante)