Learnability in Online Kernel Selection with Memory Constraint via Data-dependent Regret Analysis

Online kernel selection is a fundamental problem of online kernel methods. In this paper, we study online kernel selection with memory constraint, in which the memory of the kernel selection and online prediction procedures is limited to a fixed budget. An essential question is: what is the intrinsic relationship among online learnability, memory constraint, and data complexity? To answer the question, it is necessary to show the trade-offs between regret and memory constraint. Previous work gives a worst-case lower bound depending on the data size, and shows that learning is impossible within a small memory constraint. In contrast, we present distinct results by offering data-dependent upper bounds that rely on two data complexities: kernel alignment and the cumulative losses of a competitive hypothesis. We propose an algorithmic framework giving data-dependent upper bounds for two types of loss functions. For the hinge loss function, our algorithm achieves an expected upper bound depending on kernel alignment. For smooth loss functions, our algorithm achieves a high-probability upper bound depending on the cumulative losses of a competitive hypothesis. We also prove a matching lower bound for smooth loss functions. Our results show that if the two data complexities are sub-linear, then learning is possible within a small memory constraint. Our algorithmic framework depends on a new buffer maintaining framework and a reduction from online kernel selection to prediction with expert advice. Finally, we empirically verify the prediction performance of our algorithms on benchmark datasets.
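For context, kernel alignment is a standard measure of how well a kernel matrix $K$ matches a label vector $\mathbf{y}$; one classical form (the paper's exact data-dependent quantity may differ) is

\[ \mathcal{A}(K, \mathbf{y}) = \frac{\mathbf{y}^\top K \mathbf{y}}{\|K\|_F \, \|\mathbf{y}\|_2^2}. \]

At this level of abstraction, the hinge-loss result can be read as: regret under a fixed memory budget shrinks when the data stream is well aligned with some candidate kernel.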
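The reduction mentioned above can be pictured as follows. The sketch below is a minimal, hypothetical illustration, not the paper's algorithm: each candidate kernel is wrapped in a budgeted online learner whose support set is capped at a fixed buffer size, and an exponentially weighted forecaster selects among them. The class name BudgetedKernelLearner, the random-eviction buffer rule, the Gaussian kernel family, and all parameters are assumptions made for illustration only.

import numpy as np

class BudgetedKernelLearner:
    """Online kernel gradient descent with a fixed-size example buffer."""
    def __init__(self, bandwidth, budget, lr=0.1):
        self.bandwidth, self.budget, self.lr = bandwidth, budget, lr
        self.X, self.alpha = [], []  # support vectors and coefficients

    def _k(self, x1, x2):
        # Gaussian kernel (one possible candidate kernel).
        return np.exp(-np.sum((x1 - x2) ** 2) / (2 * self.bandwidth ** 2))

    def predict(self, x):
        return sum(a * self._k(xs, x) for xs, a in zip(self.X, self.alpha))

    def update(self, x, y):
        # Hinge-loss step; add the example only on a margin violation.
        if y * self.predict(x) < 1:
            if len(self.X) >= self.budget:
                # Simple buffer maintenance: evict a uniformly random
                # support vector (a stand-in for the paper's buffer scheme).
                i = np.random.randint(len(self.X))
                self.X.pop(i); self.alpha.pop(i)
            self.X.append(x); self.alpha.append(self.lr * y)

def run_selection(stream, bandwidths, budget, eta=0.5):
    """Aggregate budgeted learners via exponentially weighted forecasting."""
    experts = [BudgetedKernelLearner(s, budget) for s in bandwidths]
    w = np.ones(len(experts))
    mistakes = 0
    for x, y in stream:
        preds = np.array([e.predict(x) for e in experts])
        # Weighted combination of the experts' sign predictions.
        y_hat = np.sign(np.dot(w / w.sum(), np.sign(preds)) + 1e-12)
        mistakes += (y_hat != y)
        # Multiplicative-weights update with clipped per-expert hinge losses.
        losses = np.maximum(0.0, 1.0 - y * preds)
        w *= np.exp(-eta * np.minimum(losses, 1.0))
        for e in experts:
            e.update(x, y)
    return mistakes

A toy run on synthetic binary data, with all values chosen arbitrarily:

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0])
print(run_selection(zip(X, y), bandwidths=[0.5, 1.0, 2.0], budget=20))

Since every expert stores at most `budget` examples, total memory stays fixed regardless of the stream length, which is the regime the abstract's regret-memory trade-off concerns.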
@article{li2025_2407.00916,
  title={Learnability in Online Kernel Selection with Memory Constraint via Data-dependent Regret Analysis},
  author={Junfan Li and Shizhong Liao},
  journal={arXiv preprint arXiv:2407.00916},
  year={2025}
}