Title | Implicit regularization with strongly convex bias: Stability and acceleration |
Publication Type | Journal Article |
Year of Publication | 2023 |
Authors | Villa, S, Matet, S, Vũ, BCông, Rosasco, L |
Journal | Analysis and Applications |
Volume | 21 |
Issue | 01 |
Pagination | 165 - 191 |
Date Published | 01/2023
ISSN | 0219-5305 |
Abstract | Implicit regularization refers to the tendency of optimization algorithms to be biased towards a certain class of solutions. This property is relevant for understanding the behavior of modern machine learning algorithms as well as for designing efficient computational methods. While the case where the bias is given by a Euclidean norm is well understood, implicit regularization schemes for more general classes of biases are much less studied. In this work, we consider the case where the bias is given by a strongly convex functional, in the context of linear models with data possibly corrupted by noise. In particular, we propose and analyze accelerated optimization methods and highlight a trade-off between convergence speed and stability. Theoretical findings are complemented by an empirical analysis on high-dimensional inverse problems in machine learning and signal processing, showing excellent results compared to the state of the art. |
URL | https://www.worldscientific.com/doi/10.1142/S0219530522400139 |
DOI | 10.1142/S0219530522400139 |
Short Title | Anal. Appl. |
CBMM Relationship:
- CBMM Related
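
The abstract describes implicit regularization toward the minimizer of a strongly convex bias, with accelerated iterations and early stopping trading off speed against stability. The snippet below is a minimal, hypothetical sketch of that general idea, not the algorithm analyzed in the paper: it runs a linearized-Bregman-style dual iteration with an elastic-net bias psi(w) = lam*||w||_1 + 0.5*||w||^2 (whose conjugate gradient is soft-thresholding), with optional Nesterov-style momentum. All function names, parameters, and the synthetic data are illustrative assumptions.

```python
# Illustrative sketch only (assumption): implicit regularization with a strongly
# convex bias on a noisy linear model, via a dual (mirror-descent-style) iteration.
# Early stopping plays the role of regularization; the accelerated variant adds
# Nesterov-style momentum on the dual variable to show the speed/stability trade-off.
import numpy as np


def soft_threshold(v, lam):
    """Gradient of the conjugate of psi(w) = lam*||w||_1 + 0.5*||w||^2."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)


def implicit_reg(X, y, lam=1.0, n_iter=500, accelerated=False):
    """Return all iterates w_k = grad psi*(v_k); pick one by early stopping."""
    n, d = X.shape
    eta = 1.0 / np.linalg.norm(X, 2) ** 2       # step size <= 1 / ||X||^2
    v = np.zeros(d)                              # dual variable
    v_prev = np.zeros(d)
    iterates = []
    for k in range(1, n_iter + 1):
        if accelerated:
            # Nesterov-style extrapolation on the dual variable (illustrative choice)
            z = v + (k - 1) / (k + 2) * (v - v_prev)
        else:
            z = v
        w = soft_threshold(z, lam)               # primal iterate w_k
        grad = X.T @ (X @ w - y)                 # gradient of 0.5 * ||Xw - y||^2
        v_prev, v = v, z - eta * grad
        iterates.append(w)
    return iterates


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n, s = 200, 80, 5
    w_true = np.zeros(d)
    w_true[:s] = rng.standard_normal(s)          # sparse ground truth
    X = rng.standard_normal((n, d)) / np.sqrt(n)
    y = X @ w_true + 0.05 * rng.standard_normal(n)   # noisy measurements
    for acc in (False, True):
        errs = [np.linalg.norm(w - w_true) for w in implicit_reg(X, y, accelerated=acc)]
        k_best = int(np.argmin(errs))
        print(f"accelerated={acc}: best error {errs[k_best]:.3f} at iteration {k_best + 1}")
```

In this sketch the accelerated run typically reaches its best reconstruction error in fewer iterations, while the plain run changes more gradually around its optimal stopping time, a qualitative illustration of the speed-versus-stability trade-off discussed in the abstract.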