Optimality Implies Kernel Sum Classifiers are Statistically Efficient
Related Concepts
Sample complexity
Kernel (algebra)
Machine learning
Artificial intelligence
Multiple kernel learning
Random subspace method
Computer science
Kernel method
Mathematics
Pattern recognition (psychology)
Classifier (UML)
Support vector machine
Discrete mathematics
Raphael A. Meyer, Jean Honorio · 2019 · Open Access
DOI: https://doi.org/10.48550/arxiv.1901.09087 · OA: W2947734325
We propose a novel combination of optimization tools with learning theory bounds in order to analyze the sample complexity of optimal kernel sum classifiers. This contrasts with typical learning-theoretic results, which hold for all (potentially suboptimal) classifiers. Our work also justifies assumptions made in prior work on multiple kernel learning. As a byproduct of our analysis, we provide a new form of Rademacher complexity for hypothesis classes containing only optimal classifiers.
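To make the object of study concrete: a kernel sum classifier combines several base kernels by summing their Gram matrices, which again yields a valid kernel. The sketch below is not the paper's analysis or method; it is a minimal NumPy-only illustration of training a classifier on a summed kernel, using kernel ridge regression as a stand-in learner. All function names, the regularization parameter `lam`, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def linear_kernel(X, Z):
    # Gram matrix of the linear kernel k(x, z) = <x, z>
    return X @ Z.T

def rbf_kernel(X, Z, gamma=1.0):
    # Gram matrix of the RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)
    sq = (X**2).sum(1)[:, None] + (Z**2).sum(1)[None, :] - 2 * X @ Z.T
    return np.exp(-gamma * sq)

def sum_kernel(X, Z, gamma=1.0):
    # A kernel sum: the (unweighted) sum of valid kernels is itself a valid kernel
    return linear_kernel(X, Z) + rbf_kernel(X, Z, gamma)

def fit_kernel_ridge(K, y, lam=1e-2):
    # Closed-form kernel ridge solution: alpha = (K + lam * I)^{-1} y
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def predict(alpha, K_test_train):
    # Sign of the kernel expansion gives the predicted class in {-1, +1}
    return np.sign(K_test_train @ alpha)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + X[:, 1])  # linearly separable labels
y[y == 0] = 1.0

K = sum_kernel(X, X)
alpha = fit_kernel_ridge(K, y)
train_preds = predict(alpha, K)
accuracy = (train_preds == y).mean()
```

The design point this illustrates: because the sum of positive semidefinite Gram matrices is positive semidefinite, any kernel learner can be applied to the combined kernel unchanged; multiple kernel learning generalizes this by learning the combination weights rather than fixing them to 1.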