Title | For interpolating kernel machines, minimizing the norm of the ERM solution maximizes stability |
Publication Type | Journal Article |
Year of Publication | 2023 |
Authors | Rangamani, A, Rosasco, L, Poggio, T |
Journal | Analysis and Applications |
Volume | 21 |
Issue | 01 |
Pagination | 193–215 |
Date Published | 01/2023 |
ISSN | 0219-5305 |
Keywords | Algorithmic stability, high dimensional statistics, kernel regression, minimum norm interpolation, overparameterization |
Abstract | In this paper, we study kernel ridge-less regression, including the case of interpolating solutions. We prove that maximizing the leave-one-out (CVloo) stability minimizes the expected error. Further, we also prove that the minimum norm solution — to which gradient algorithms are known to converge — is the most stable solution. More precisely, we show that the minimum norm interpolating solution minimizes a bound on stability, which in turn is controlled by the smallest singular value, hence the condition number, of the empirical kernel matrix. These quantities can be characterized in the asymptotic regime where both the dimension (d) and cardinality (n) of the data go to infinity (with n ≍ d as n, d → ∞). Our results suggest that the property of stability of the learning algorithm with respect to perturbations of the training set may provide a more general framework than the classical theory of Empirical Risk Minimization (ERM). While ERM was developed to deal with the classical regime in which the architecture of the learning network is fixed and n → ∞, the modern regime focuses on interpolating regressors and overparameterized models, when both n and d go to infinity. Since the stability framework is known to be equivalent to the classical theory in the classical regime, our results here suggest that it may be interesting to extend it beyond kernel regression to other overparameterized algorithms such as deep networks. |
URL | https://www.worldscientific.com/doi/10.1142/S0219530522400115 |
DOI | 10.1142/S0219530522400115 |
Short Title | Anal. Appl. |
Associated Module:
CBMM Relationship:
- CBMM Funded
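
The sketch below is a minimal numerical illustration (not code from the paper) of the quantities the abstract refers to: the minimum-norm interpolating solution of kernel ridge-less regression and the smallest singular value and condition number of the empirical kernel matrix. The Gaussian kernel, its bandwidth, and the synthetic data are assumptions made purely for illustration.

```python
# Minimal illustrative sketch (assumptions: Gaussian kernel, sigma = 1,
# synthetic Gaussian data). Shows the minimum-norm interpolating solution
# of kernel ridge-less regression and the spectral quantities (smallest
# singular value, condition number of the kernel matrix) that the abstract
# says control the stability bound.
import numpy as np

rng = np.random.default_rng(0)

n, d = 200, 100                      # cardinality n and dimension d of the data
X = rng.standard_normal((n, d)) / np.sqrt(d)
y = rng.standard_normal(n)

# Gaussian (RBF) kernel matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
sigma = 1.0
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq_dists / (2 * sigma ** 2))

# Minimum-norm interpolating coefficients: alpha = K^+ y. The pseudoinverse
# returns the least-norm solution when K is singular and equals K^{-1} y
# when K is positive definite.
alpha = np.linalg.pinv(K) @ y

# Spectral quantities appearing in the stability bound.
svals = np.linalg.svd(K, compute_uv=False)   # sorted in decreasing order
sigma_min, sigma_max = svals[-1], svals[0]
print("smallest singular value:", sigma_min)
print("condition number:       ", sigma_max / sigma_min)
print("training residual:      ", np.linalg.norm(K @ alpha - y))  # ≈ 0 when K is numerically invertible
```

The pseudoinverse is used here because, among all coefficient vectors that interpolate the training data, it selects the one of minimum norm, which is the solution the abstract identifies as the one gradient algorithms converge to.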