Friday, July 12, 2024 - 16:15 in V2-205
On Riemannian Optimization, Propagation of Chaos, and their Connections to Machine Learning
A talk in the CRC Seminar series by
Sebastian Kassing from Bielefeld University
Abstract:
In this talk, we consider machine learning optimization problems posed on a Euclidean space or a Riemannian manifold. We define the (Riemannian) stochastic gradient descent process and describe its dynamical behavior in the small learning rate regime via an SDE. We then prove quantitative bounds on the weak error of this diffusion approximation under assumptions on the geometry of the manifold and on the random estimators of the gradient. For a supervised learning task with a shallow artificial neural network in Euclidean space, we also describe the optimization dynamics in the joint small learning rate and infinite width scaling regime. These ideas connect the field of stochastic optimization with SDEs on manifolds, Riemannian geometry, and interacting particle systems. Within the CRC, this talk is associated with the project(s): B8
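
For orientation, the following is a sketch of the standard setup behind the first part of the abstract. The symbols f, G, eta, Sigma, and the O(eta) weak-error rate are illustrative of the usual formulation in this literature, not claims about the talk's exact assumptions or results.

```latex
% Riemannian SGD with learning rate \eta and an unbiased stochastic
% gradient estimator G (illustrative; the talk's assumptions may differ):
\[
  X_{k+1} \;=\; \exp_{X_k}\!\bigl(-\eta\, G(X_k, \xi_k)\bigr),
  \qquad \mathbb{E}\bigl[G(x,\xi)\bigr] \;=\; \operatorname{grad} f(x).
\]
% Small-learning-rate diffusion approximation, with \Sigma the covariance
% of the gradient noise and B a Brownian motion:
\[
  \mathrm{d}Y_t \;=\; -\operatorname{grad} f(Y_t)\,\mathrm{d}t
  \;+\; \sqrt{\eta}\,\Sigma(Y_t)^{1/2}\,\mathrm{d}B_t.
\]
% Schematic form of a weak-error bound in this setting: for suitable test
% functions \varphi and a constant C independent of \eta,
\[
  \bigl|\,\mathbb{E}[\varphi(X_k)] - \mathbb{E}[\varphi(Y_{k\eta})]\,\bigr|
  \;\le\; C\,\eta .
\]
```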
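As a concrete numerical illustration, here is a minimal sketch of Riemannian SGD on the unit sphere, using tangent-space projection and a normalization retraction in place of the exponential map. The names (sphere_rsgd, grad_est) and the toy eigenvector problem are assumptions made for this sketch, not material from the talk.

```python
import numpy as np

def sphere_rsgd(grad_est, x0, eta=1e-2, steps=5000, rng=None):
    """Minimal Riemannian SGD on the unit sphere S^{d-1} (illustrative sketch).

    grad_est(x, rng) returns a noisy estimate of the Euclidean gradient of
    the objective; it is projected onto the tangent space at x, and the
    iterate is retracted back to the sphere by normalization.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = grad_est(x, rng)
        g_tan = g - (x @ g) * x   # project onto the tangent space at x
        x = x - eta * g_tan       # stochastic gradient step
        x /= np.linalg.norm(x)    # retraction back to the sphere
    return x

# Toy problem: minimize f(x) = x^T A x over the sphere; the minimizer is
# an eigenvector for the smallest eigenvalue of A (here +/- e_1).
d = 10
A = np.diag(np.arange(1.0, d + 1.0))
noisy_grad = lambda x, rng: 2.0 * A @ x + 0.1 * rng.standard_normal(d)
x_hat = sphere_rsgd(noisy_grad, np.ones(d))
print(np.round(x_hat, 3))  # concentrates near +/- e_1
```

The normalization step is a first-order retraction: on the sphere it agrees with the exponential map to first order in the step size, which is exactly the small learning rate regime in which the diffusion approximation above is stated.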