Statistical breakthroughs and novel perspectives on deep learning theory

06.10.2025 16:45 - 17:45

In recent years, deep learning has emerged as a transformative field, with a theory spanning several disciplines such as approximation theory, statistics, and optimization. Despite remarkable advances, the rapid evolution of AI-driven methods continually outpaces our theoretical understanding. New challenges, from overparametrization and diffusion models to Transformer learning, arise almost yearly, underscoring the gap between theory and practice.

In this talk, we delve into key theoretical breakthroughs, with a particular focus on statistical results. We critically question the prevailing frameworks and introduce a novel statistical approach to image analysis. Rather than treating images as high-dimensional data entities, our framework reconceptualizes them as structured objects shaped by geometric deformations such as shifts, scales, and orientations. The goal of the classification rule is then to learn the uninformative deformations, resulting in convergence rates with more favorable trade-offs between input dimension and sample size. This fresh perspective not only provides new guarantees for approximation and convergence in deep-learning-based image classification but also redefines how we approach image analysis, with the potential for broader application to other learning tasks. We conclude by discussing emerging research directions and reflecting on the role of theory in the field.
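As a toy illustration of the idea of "learning away" uninformative deformations (this sketch is not the paper's construction; the deformation here is a cyclic pixel shift, and the invariant feature is chosen for simplicity), one can build a classifier statistic whose value does not depend on the deformation parameter:

```python
import numpy as np

def deform(img, dx, dy):
    # A hypothetical 'uninformative' deformation: cyclic shift by (dx, dy)
    return np.roll(img, (dy, dx), axis=(0, 1))

def invariant_score(img, template):
    # Maximize the inner product with a template over all shifts;
    # the result is invariant to the (cyclic) shift applied to img
    h, w = img.shape
    return max(
        float(np.sum(deform(img, dx, dy) * template))
        for dx in range(w) for dy in range(h)
    )

template = np.zeros((8, 8))
template[2:6, 2:6] = 1.0            # a toy class pattern
x0 = template.copy()                # undeformed image of that class
x_shifted = deform(x0, 3, 5)        # same content, shifted

# The score is unchanged by the deformation:
assert np.isclose(invariant_score(x0, template),
                  invariant_score(x_shifted, template))
```

Because the statistic depends only on the image modulo the shift group, a classifier built on it effectively works in a lower-dimensional problem, which is the intuition behind the improved dimension-versus-sample-size trade-offs mentioned above.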

Underlying paper: https://arxiv.org/abs/2206.02151

Personal website of Sophie Langer
Location:
HS 7 OMP1 (#1.303)