Tight bounds for minimum ℓ1-norm interpolation of noisy data
Guillaume Wang, Konstantin Donhauser, Fanny Yang · 2021 · Open Access
DOI: https://doi.org/10.48550/arxiv.2111.05987 · OA: W3213893722
We provide matching upper and lower bounds of order $\sigma^2/\log(d/n)$ for the prediction error of the minimum $\ell_1$-norm interpolator, a.k.a. basis pursuit. Our result is tight up to negligible terms when $d \gg n$, and is the first to imply asymptotic consistency of noisy minimum-norm interpolation for isotropic features and sparse ground truths. Our work complements the literature on "benign overfitting" for minimum $\ell_2$-norm interpolation, where asymptotic consistency can be achieved only when the features are effectively low-dimensional.
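For illustration, the object studied here, the minimum $\ell_1$-norm interpolator, is the solution of the basis pursuit program $\min_w \|w\|_1$ subject to $Xw = y$, which can be cast as a linear program. The sketch below is not from the paper: the use of SciPy's `linprog`, the LP reformulation, and all parameter choices (n, d, sigma, sparsity) are illustrative assumptions. It generates isotropic Gaussian features with a sparse ground truth, computes the interpolator, and compares its squared parameter error (which equals the expected squared prediction error on a fresh isotropic feature vector) with the rate $\sigma^2/\log(d/n)$.

```python
# Minimal sketch (not the paper's experiments): minimum l1-norm interpolation
# (basis pursuit) on synthetic data with isotropic Gaussian features and a
# sparse ground truth. Parameter choices are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, d, sigma, k = 50, 2000, 0.5, 3   # samples, features, noise level, sparsity

# Isotropic Gaussian features and a k-sparse ground truth.
X = rng.standard_normal((n, d))
w_star = np.zeros(d)
w_star[:k] = 1.0
y = X @ w_star + sigma * rng.standard_normal(n)

# Basis pursuit:  min ||w||_1  subject to  X w = y.
# Standard LP reformulation with w = u - v, u >= 0, v >= 0:
#   min 1^T u + 1^T v   s.t.   [X, -X] [u; v] = y.
c = np.ones(2 * d)
A_eq = np.hstack([X, -X])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
w_hat = res.x[:d] - res.x[d:]

# Interpolation check, and squared parameter error, which for isotropic
# features equals the expected squared prediction error on a fresh point.
print("max |X w_hat - y|    :", np.abs(X @ w_hat - y).max())
print("||w_hat - w_star||^2 :", np.sum((w_hat - w_star) ** 2))
print("sigma^2 / log(d/n)   :", sigma**2 / np.log(d / n))
```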