Semi-supervised Vector-valued Learning: From Theory to Algorithm

Pattern Recognition (Pattern Recognit.), 2019
Abstract

Vector-valued learning, in which the output space admits a vector-valued structure, is an important problem that covers a broad family of domains, e.g., multi-label learning and multi-class classification. Using local Rademacher complexity and unlabeled data, we derive novel data-dependent excess risk bounds for learning vector-valued functions in both the kernel space and the linear space. The derived bounds are much sharper than existing ones: the convergence rate improves from $\mathcal{O}(1/\sqrt{n})$ to $\mathcal{O}(1/\sqrt{n+u})$, and to $\mathcal{O}(1/n)$ in special cases. Motivated by our theoretical analysis, we propose a unified framework for learning vector-valued functions that incorporates both local Rademacher complexity and Laplacian regularization. Empirical results on a wide range of benchmark datasets show that the proposed algorithm significantly outperforms baseline methods, in line with our theoretical findings.
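The Laplacian-regularized, vector-valued learning setup the abstract describes can be illustrated with a minimal kernel-space sketch. This is not the paper's algorithm (which additionally uses local Rademacher complexity regularization); it is a generic Laplacian-regularized least-squares solver in the style of manifold regularization, with all function names and hyperparameters (`laprls_fit`, `lam`, `mu`, `gamma`) chosen here for illustration. The unlabeled points enter only through the graph Laplacian term, which is what lets semi-supervised bounds depend on n+u rather than n alone.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def laprls_fit(X_lab, Y_lab, X_unlab, lam=1e-2, mu=1e-2, gamma=1.0):
    """Laplacian-regularized least squares for vector-valued outputs (sketch).

    Finds coefficients alpha (one row per input point, one column per output
    dimension) for f(x) = sum_i alpha_i k(x_i, x), minimizing
        ||J K alpha - Y||^2 + lam * tr(alpha^T K alpha)
                            + mu  * tr(alpha^T K L K alpha),
    where K is the kernel matrix over labeled + unlabeled inputs, L is the
    graph Laplacian built from kernel similarities, and J selects the
    labeled rows. Setting the gradient to zero (and assuming K invertible)
    gives the linear system solved below.
    """
    X = np.vstack([X_lab, X_unlab])
    n, m = len(X_lab), len(X)
    K = rbf_kernel(X, X, gamma)
    # Graph Laplacian L = D - W from kernel similarities over all inputs
    W = K.copy()
    L = np.diag(W.sum(axis=1)) - W
    # Selector J: n x m matrix picking the labeled rows of K
    J = np.zeros((n, m))
    J[np.arange(n), np.arange(n)] = 1.0
    # Stationarity condition: (J^T J K + lam I + mu L K) alpha = J^T Y
    A = J.T @ J @ K + lam * np.eye(m) + mu * L @ K
    alpha = np.linalg.solve(A, J.T @ Y_lab)
    return X, alpha, gamma

def laprls_predict(model, X_new):
    X, alpha, gamma = model
    return rbf_kernel(X_new, X, gamma) @ alpha
```

For multi-class classification, `Y_lab` would hold one-hot rows and predictions are made by taking the argmax over output dimensions; multi-label learning instead thresholds each output coordinate.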