ResearchTrend.AI
arXiv:1006.1129
Predictive PAC learnability: a paradigm for learning from exchangeable input data

6 June 2010
V. Pestov
Abstract

Exchangeable random variables form an important and well-studied generalization of i.i.d. variables; however, simple examples show that no nontrivial concept or function classes are PAC learnable under general exchangeable data inputs $X_1, X_2, \ldots$. Inspired by the work of Berti and Rigo on a Glivenko--Cantelli theorem for exchangeable inputs, we propose a new paradigm, adequate for learning from exchangeable data: predictive PAC learnability. A learning rule $\mathcal{L}$ for a function class $\mathscr{F}$ is predictive PAC if for every $\varepsilon, \delta > 0$ and each function $f \in \mathscr{F}$, whenever $\lvert\sigma\rvert \geq s(\delta, \varepsilon)$, we have with confidence $1 - \delta$ that the expected difference between $f(X_{n+1})$ and the image of $f\vert\sigma$ under $\mathcal{L}$ does not exceed $\varepsilon$, conditionally on $X_1, X_2, \ldots, X_n$. Thus, instead of learning the function $f$ as such, we are learning to a given accuracy $\varepsilon$ the predictive behaviour of $f$ at the future points $X_i(\omega)$, $i > n$, of the sample path. Using de Finetti's theorem, we show that if a universally separable function class $\mathscr{F}$ is distribution-free PAC learnable under i.i.d. inputs, then it is distribution-free predictive PAC learnable under exchangeable inputs, with a slightly worse sample complexity.
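The defining condition can be written as a display formula. This is one plausible reading of the abstract's informal statement (the precise measurability conventions are in the paper itself): here $\sigma = (X_1, \ldots, X_n)$ denotes the sample labelled by $f$, $s(\delta, \varepsilon)$ is the sample-complexity bound, and the learned hypothesis $\mathcal{L}(f\vert\sigma)$ is assumed to be compared with $f$ at the next input point $X_{n+1}$:

```latex
\[
  \Pr\Bigl[\,
    \mathbb{E}\bigl(
      \lvert f(X_{n+1}) - \mathcal{L}(f\vert\sigma)(X_{n+1}) \rvert
      \,\big\vert\, X_1, \ldots, X_n
    \bigr) \le \varepsilon
  \,\Bigr] \ge 1 - \delta
  \qquad \text{whenever } \lvert\sigma\rvert = n \ge s(\delta, \varepsilon).
\]
```

Note that the expectation is conditional on the observed prefix $X_1, \ldots, X_n$, which is what makes the criterion predictive: it controls error only along the future of the same sample path, not against an underlying distribution on functions or inputs.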
