Speaker: Roberto Imbuzeiro Oliveira
Date: Aug 20, 2020 at 15:00
Abstract: A natural approach to understanding overparameterized deep neural networks is to ask whether there is some kind of natural limiting behavior when the number of neurons diverges. We present a rigorous limit result of this kind for networks with complete connections and “random-feature-style” first and last layers. Specifically, we show that network weights are approximated by certain “ideal particles” whose distribution and dependencies are described by a McKean-Vlasov mean-field model. We will present the intuition behind our approach, sketch some of the key technical challenges along the way, and connect our results to some of the recent literature on the topic.
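For orientation, the generic form of McKean-Vlasov dynamics referenced in the abstract is the following (this is the standard general shape of such models; the specific drift, diffusion, and network setting of the talk are not stated here):

$$
dX_t = b(X_t, \mu_t)\,dt + \sigma(X_t, \mu_t)\,dW_t, \qquad \mu_t = \mathrm{Law}(X_t),
$$

so each “ideal particle” evolves under coefficients that depend on the law of the particle itself, which is what couples the particles in the mean-field limit.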
Venue: Online via Zoom (if you are interested in the link, please send a message to the organizers through the contact form)