
Thursday, November 21, 2024 - 16:00 in T2-234


A Geometrical Analysis of Kernel Ridge Regression and its Applications

A talk in the Oberseminar Probability Theory and Mathematical Statistics series by
Zong Shang from CREST Paris

Abstract: We obtain upper bounds on the estimation error of Kernel Ridge Regression (KRR) across all non-negative regularization parameters, providing a geometric perspective on various phenomena in KRR. As applications:

1. We address the Multiple Descents problem, unifying the proofs of [LRZ20] and [GMMM21] for polynomial kernels in the non-asymptotic regime, and we establish Multiple Descents for KRR's generalization error with polynomial kernels under sub-Gaussian design in asymptotic regimes.
2. In the non-asymptotic setting, we establish a one-sided isomorphic version of the Gaussian Equivalence Conjecture for sub-Gaussian design vectors.
3. We offer a novel perspective on the linearization of kernel matrices for non-linear kernels, extending it to the power regime for polynomial kernels.
4. Our theory applies to data-dependent kernels, providing an effective tool for understanding feature learning in deep learning.
5. Our theory extends the results of [TB23] under weaker moment assumptions.
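For orientation, and as standard background rather than anything specific to this work: given data $(x_1, y_1), \dots, (x_n, y_n)$ and a positive semi-definite kernel $K$ with reproducing kernel Hilbert space $\mathcal{H}$, the KRR estimator with regularization parameter $\lambda \ge 0$ is (under one common normalization; conventions for the factors of $n$ vary)

\[
\hat{f}_\lambda \in \operatorname*{arg\,min}_{f \in \mathcal{H}} \ \frac{1}{n} \sum_{i=1}^n \big( y_i - f(x_i) \big)^2 + \lambda \| f \|_{\mathcal{H}}^2,
\qquad
\hat{f}_\lambda(x) = \big( K(x, x_1), \dots, K(x, x_n) \big) \, (\mathbf{K} + n \lambda I_n)^{-1} y,
\]

where $\mathbf{K} = (K(x_i, x_j))_{i,j=1}^n$ is the kernel matrix. The boundary case $\lambda = 0$ gives the minimum-norm interpolant whenever $\mathbf{K}$ is invertible, which is the regime where descent phenomena are typically studied.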

Our proofs leverage three mathematical tools developed in this work that may also be of independent interest:

1. a Dvoretzky-Milman theorem for ellipsoids under weak moment assumptions;
2. a Restricted Isomorphic Property in Reproducing Kernel Hilbert Spaces with embedding index conditions;
3. a concentration inequality for finite-degree polynomial kernel functions.
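As a purely illustrative aside, not code from the paper: the objects appearing in the abstract, namely a finite-degree polynomial kernel, a sub-Gaussian (here Gaussian) design, and the KRR estimator swept over regularization parameters $\lambda \ge 0$, can be instantiated numerically in a few lines. The kernel $k(x, z) = (1 + \langle x, z \rangle / d)^p$, the toy target, and the sample sizes below are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def poly_kernel(X, Z, degree=3):
    """Finite-degree polynomial kernel k(x, z) = (1 + <x, z>/d)^degree."""
    d = X.shape[1]
    return (1.0 + X @ Z.T / d) ** degree

def krr_predict(X_train, y_train, X_test, lam, degree=3):
    """Closed-form KRR prediction: f(x) = k(x, X_train) (K + n*lam*I)^{-1} y_train."""
    n = len(y_train)
    K = poly_kernel(X_train, X_train, degree)
    if lam == 0.0:
        # Ridgeless limit: minimum-norm solution via the pseudo-inverse.
        alpha = np.linalg.pinv(K) @ y_train
    else:
        alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    return poly_kernel(X_test, X_train, degree) @ alpha

d, n, n_test = 20, 200, 2000
X_train = rng.standard_normal((n, d))   # Gaussian, hence sub-Gaussian, design
X_test = rng.standard_normal((n_test, d))

def f_star(X):
    # Toy degree-3 polynomial target (an assumption for this sketch).
    return X[:, 0] * X[:, 1] + 0.5 * X[:, 2] ** 3

y_train = f_star(X_train) + 0.1 * rng.standard_normal(n)

for lam in [0.0, 1e-6, 1e-4, 1e-2, 1.0]:   # sweep over lambda >= 0
    mse = np.mean((krr_predict(X_train, y_train, X_test, lam) - f_star(X_test)) ** 2)
    print(f"lambda = {lam:g}   test MSE = {mse:.4f}")
```

Plotting such error curves over $\lambda$, or over the sample-to-dimension ratio, is one way to visualize the descent phenomena discussed above; the talk's results are theorems rather than simulations.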

The associated paper can be found at https://arxiv.org/abs/2404.07709. This work is joint with Georgios Gavrilopoulos and Guillaume Lecué.



