
Joint universal lossy coding and identification of stationary mixing sources with general alphabets

13 January 2009
Maxim Raginsky
Abstract

We consider the problem of joint universal variable-rate lossy coding and identification for parametric classes of stationary $\beta$-mixing sources with general (Polish) alphabets. Compression performance is measured in terms of Lagrangians, while identification performance is measured by the variational distance between the true source and the estimated source. Provided that the sources are mixing at a sufficiently fast rate and satisfy certain smoothness and Vapnik-Chervonenkis learnability conditions, it is shown that, for bounded metric distortions, there exist universal schemes for joint lossy compression and identification whose Lagrangian redundancies converge to zero as $\sqrt{V_n \log n / n}$ as the block length $n$ tends to infinity, where $V_n$ is the Vapnik-Chervonenkis dimension of a certain class of decision regions defined by the $n$-dimensional marginal distributions of the sources; furthermore, for each $n$, the decoder can identify the $n$-dimensional marginal of the active source up to a ball of radius $O(\sqrt{V_n \log n / n})$ in variational distance, eventually with probability one. The results are supplemented by several examples of parametric sources satisfying the regularity conditions.
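As a rough numerical illustration (not from the paper), the stated redundancy rate $\sqrt{V_n \log n / n}$ can be evaluated for hypothetical values of the VC dimension $V_n$ and block length $n$, with the implied constant set to 1:

```python
import math

def lagrangian_redundancy_rate(V_n: float, n: int) -> float:
    """Rate sqrt(V_n * log(n) / n) from the abstract, with constant 1.

    V_n is a hypothetical VC dimension; in the paper it depends on the
    class of decision regions induced by the n-dimensional marginals.
    """
    return math.sqrt(V_n * math.log(n) / n)

# For a fixed hypothetical VC dimension of 5, the rate vanishes as n grows.
for n in (100, 10_000, 1_000_000):
    print(f"n = {n:>9}: rate ~ {lagrangian_redundancy_rate(5, n):.4f}")
```

Since $\log n$ grows slower than any positive power of $n$, the rate tends to zero whenever $V_n$ grows sub-linearly in $n / \log n$.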
