arXiv:2104.10637
Robust Kernel-based Distribution Regression

21 April 2021
Zhan Yu
D. Ho
Ding-Xuan Zhou
Abstract

Regularization schemes for regression have been widely studied in learning theory and inverse problems. In this paper, we study distribution regression (DR), which involves two stages of sampling and aims at regressing from probability measures to real-valued responses over a reproducing kernel Hilbert space (RKHS). Recently, theoretical analysis of DR has been carried out via kernel ridge regression, and several learning behaviors have been observed. However, the topic has not been explored or understood beyond least-squares-based DR. By introducing a robust loss function $l_{\sigma}$ for two-stage sampling problems, we present a novel robust distribution regression (RDR) scheme. With an appropriately chosen windowing function $V$ and scaling parameter $\sigma$, $l_{\sigma}$ covers a wide range of commonly used loss functions, enriching the theme of DR. Moreover, the loss $l_{\sigma}$ is not necessarily convex, which substantially extends the class of losses (beyond least squares) treated in the DR literature. The learning rates under different regularity ranges of the regression function $f_{\rho}$ are comprehensively studied and derived via integral-operator techniques. The scaling parameter $\sigma$ is shown to be crucial in providing robustness and satisfactory learning rates for RDR.
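The abstract does not fix a particular windowing function or solver, so the following is only a rough illustration of the two-stage setup, not the authors' algorithm. The sketch assumes the windowed form $l_{\sigma}(u) = \sigma^2 V(u^2/\sigma^2)$ with the Welsch window $V(u) = 1 - e^{-u}$ (a common choice in robust kernel regression), estimates kernel mean embeddings from the second-stage samples, and fits by iteratively reweighted regularized least squares; all function names and parameter values below are illustrative.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between rows of X and Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def mean_embedding_gram(bags, gamma=1.0):
    """Gram matrix of empirical kernel mean embeddings.

    bags: list of (n_i, d) arrays, each the second-stage sample drawn
    from one distribution (the first-stage sample). Entry (i, j)
    approximates <mu_i, mu_j> in the RKHS via the average of pairwise
    kernel evaluations.
    """
    m = len(bags)
    G = np.empty((m, m))
    for i in range(m):
        for j in range(i, m):
            G[i, j] = G[j, i] = rbf_kernel(bags[i], bags[j], gamma).mean()
    return G

def robust_dr_fit(G, y, lam=1e-2, sigma=1.0, n_iter=20):
    """Robust distribution regression via iteratively reweighted
    regularized least squares (an assumed solver, not from the paper).

    Uses the Welsch-type loss l_sigma(r) = sigma^2 (1 - exp(-r^2/sigma^2)):
    the half-quadratic weights w_i = exp(-r_i^2 / sigma^2) downweight
    distributions with outlying responses.
    """
    m = len(y)
    alpha = np.linalg.solve(G + lam * m * np.eye(m), y)  # ridge init
    for _ in range(n_iter):
        r = G @ alpha - y                    # residuals on the bags
        w = np.exp(-(r ** 2) / sigma ** 2)   # robust reweighting
        W = np.diag(w)
        alpha = np.linalg.solve(W @ G + lam * m * np.eye(m), W @ y)
    return alpha

# Toy two-stage sample: regress the mean of each distribution,
# with a few responses corrupted to mimic outliers.
rng = np.random.default_rng(0)
bags = [rng.normal(mu, 1.0, size=(50, 1)) for mu in rng.uniform(-2, 2, 30)]
y = np.array([b.mean() for b in bags])
y[::10] += 5.0                               # corrupt every 10th response
G = mean_embedding_gram(bags, gamma=0.5)
alpha = robust_dr_fit(G, y, lam=1e-3, sigma=1.0)
pred = G @ alpha
```

Smaller $\sigma$ makes the reweighting more aggressive (stronger robustness, as the abstract suggests), while $\sigma \to \infty$ recovers ordinary kernel ridge regression on the mean embeddings.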
