Kullback-Leibler excess risk bounds for exponential weighted aggregation in Generalized linear models

14 April 2025
Tien Mai
Abstract

Aggregation methods have emerged as a powerful and flexible framework in statistical learning, providing unified solutions across diverse problems such as regression, classification, and density estimation. In the context of generalized linear models (GLMs), where responses follow exponential family distributions, aggregation offers an attractive alternative to classical parametric modeling. This paper investigates the problem of sparse aggregation in GLMs, aiming to approximate the true parameter vector by a sparse linear combination of predictors. We prove that an exponential weighted aggregation scheme yields a sharp oracle inequality for the Kullback-Leibler risk with leading constant equal to one, while also attaining the minimax-optimal rate of aggregation. These results are further enhanced by establishing high-probability bounds on the excess risk.
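To make the abstract's scheme concrete, below is a minimal, hedged sketch of a generic exponential weighted aggregation (EWA) procedure for a logistic-regression GLM. The candidate set, uniform prior, and temperature `beta` are illustrative choices introduced here for the example; they are not the paper's exact construction or its theoretically prescribed tuning.

```python
import numpy as np

def neg_log_lik(theta, X, y):
    """Empirical logistic negative log-likelihood (the GLM loss)."""
    z = X @ theta
    # mean of log(1 + exp(z)) - y*z, computed stably via logaddexp
    return float(np.mean(np.logaddexp(0.0, z) - y * z))

def ewa_aggregate(candidates, X, y, beta=4.0, prior=None):
    """Exponential weighted aggregation: w_j ∝ prior_j * exp(-beta * n * Rhat_j),
    then return the weighted average of the candidate parameter vectors."""
    n = len(y)
    risks = np.array([neg_log_lik(th, X, y) for th in candidates])
    if prior is None:
        prior = np.full(len(candidates), 1.0 / len(candidates))
    log_w = np.log(prior) - beta * n * risks
    log_w -= log_w.max()            # shift for numerical stability
    w = np.exp(log_w)
    w /= w.sum()
    theta_hat = sum(wj * th for wj, th in zip(w, candidates))
    return theta_hat, w

# Synthetic data with a sparse true parameter (illustrative setup)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_theta = np.array([1.5, 0.0, 0.0])
p = 1.0 / (1.0 + np.exp(-X @ true_theta))
y = (rng.random(200) < p).astype(float)

# A small grid of sparse candidate vectors (hypothetical candidate set)
candidates = [np.array([c, 0.0, 0.0]) for c in (0.5, 1.0, 1.5, 2.0)]
theta_hat, weights = ewa_aggregate(candidates, X, y)
```

With a large effective temperature (`beta * n` here), the weights concentrate on candidates with low empirical risk, which is the mechanism behind the oracle inequalities the paper establishes for the Kullback-Leibler risk.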

@article{mai2025_2504.10171,
  title={Kullback-Leibler excess risk bounds for exponential weighted aggregation in Generalized linear models},
  author={Tien Mai},
  journal={arXiv preprint arXiv:2504.10171},
  year={2025}
}