
Thursday, December 4, 2025 - 16:15 in V2-210/216


The Statistical Foundations Deep Learning Still Needs

A talk in the Oberseminar Probability Theory and Mathematical Statistics series by
Sophie Langer from Universität Bochum

Abstract: In recent years, deep learning has emerged as a transformative field whose theory draws on several disciplines, including approximation theory, statistics, and optimization. Despite remarkable advances, the rapid evolution of AI-driven methods continually outpaces our theoretical understanding. New challenges, from overparametrization and diffusion models to Transformer learning, arise almost yearly, underscoring the gap between theory and practice.

In this talk, we explore recent theoretical breakthroughs with a particular emphasis on statistical insights. We critically examine prevailing statistical frameworks and proof strategies and address their limitations by (1) providing statistical results for estimators trained via gradient descent, rather than analyzing empirical risk minimizers, thereby better aligning statistical analysis with optimization practice; and (2) introducing a new statistical prediction framework for image classification. Together, these results represent first steps toward rethinking statistical theory in the deep-learning era and, we hope, help pave the way for further advances in this direction.

Within the CRC, this talk is associated with project B10.
