Adaptive Low-Complexity Sequential Inference for Dirichlet Process Mixture Models

29 September 2014 · arXiv:1409.8185
Theodoros Tsiligkaridis, K. W. Forsythe
Abstract

We develop a sequential low-complexity inference procedure for Dirichlet process mixtures of Gaussians for online clustering and parameter estimation when the number of clusters is unknown a priori. We present an easily computable, closed-form parametric expression for the conditional likelihood, in which hyperparameters are recursively updated as a function of the streaming data assuming conjugate priors. Motivated by large-sample asymptotics, we propose a novel adaptive low-complexity design for the Dirichlet process concentration parameter and show that the number of classes grows at most at a logarithmic rate. We further prove that in the large-sample limit, the conditional likelihood becomes asymptotically Gaussian. We apply this methodology to the problem of adaptive signal modulation recognition in digital communications, showing asymptotic optimality of our method in terms of bit error rate performance.
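
To make the flavor of the approach concrete, here is a minimal Python sketch of the kind of sequential, conjugate-prior inference the abstract describes; it is not the authors' algorithm. It uses 1-D Gaussian clusters with known variance sigma2 and a conjugate Normal(mu0, tau0_sq) prior on each mean, hard MAP assignments, and a hypothetical decaying concentration schedule alpha_t standing in for the paper's adaptive design. All parameter names and the alpha_t schedule are illustrative assumptions.

```python
import numpy as np

def sequential_dpmm_1d(stream, sigma2=1.0, mu0=0.0, tau0_sq=10.0, alpha0=1.0):
    """One-pass greedy clustering for a DP mixture of 1-D Gaussians (sketch)."""
    clusters = []           # per-cluster sufficient statistics: count n, sum s
    labels = []
    for t, x in enumerate(stream, start=1):
        # Illustrative decaying concentration schedule (assumption, not the
        # paper's adaptive design): shrinks the weight of opening new clusters.
        alpha_t = alpha0 / np.log(t + np.e)

        scores = []
        for c in clusters:
            # Conjugate Normal prior on the mean with known variance sigma2:
            # posterior N(mu_n, tau_n_sq) after n observations with sum s,
            # posterior predictive N(mu_n, sigma2 + tau_n_sq) -- closed form.
            tau_n_sq = 1.0 / (1.0 / tau0_sq + c["n"] / sigma2)
            mu_n = tau_n_sq * (mu0 / tau0_sq + c["s"] / sigma2)
            var = sigma2 + tau_n_sq
            loglik = -0.5 * np.log(2.0 * np.pi * var) - 0.5 * (x - mu_n) ** 2 / var
            scores.append(np.log(c["n"]) + loglik)      # CRP weight x predictive

        # Opening a new cluster: prior predictive N(mu0, sigma2 + tau0_sq).
        var0 = sigma2 + tau0_sq
        loglik0 = -0.5 * np.log(2.0 * np.pi * var0) - 0.5 * (x - mu0) ** 2 / var0
        scores.append(np.log(alpha_t) + loglik0)

        k = int(np.argmax(scores))                      # greedy MAP assignment
        if k == len(clusters):
            clusters.append({"n": 0, "s": 0.0})
        clusters[k]["n"] += 1                           # recursive update of
        clusters[k]["s"] += x                           # sufficient statistics
        labels.append(k)
    return labels, clusters

# Example: two well-separated Gaussian components arriving as a shuffled stream.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-5.0, 1.0, 200), rng.normal(5.0, 1.0, 200)])
rng.shuffle(data)
labels, clusters = sequential_dpmm_1d(data)
print(f"{len(clusters)} clusters found")
```

The hard assignment and sufficient-statistic updates keep the per-sample cost linear in the current number of clusters, which is the sense in which such a scheme stays low-complexity for streaming data.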
