On posterior contraction of parameters and interpretability in Bayesian mixture modeling
Aritra Guha, Nhat Ho, X. Nguyen
arXiv:1901.05078, 15 January 2019

Papers citing "On posterior contraction of parameters and interpretability in Bayesian mixture modeling"

1. Model-free Estimation of Latent Structure via Multiscale Nonparametric Maximum Likelihood (29 Oct 2024)
   Bryon Aragam, Ruiyi Yang
2. Bayesian mixture models (in)consistency for the number of clusters (25 Oct 2022)
   Louise Alamichel, D. Bystrova, Julyan Arbel, Guillaume Kon Kam King
3. Consistency of mixture models with a prior on the number of components (06 May 2022)
   Jeffrey W. Miller
4. Selective inference for k-means clustering (29 Mar 2022)
   Yiqun T. Chen, Daniela Witten
5. Beyond Black Box Densities: Parameter Learning for the Deviated Components (05 Feb 2022)
   Dat Do, Nhat Ho, X. Nguyen
6. MCMC computations for Bayesian mixture models using repulsive point processes (12 Nov 2020)
   Mario Beraha, R. Argiento, Jesper Møller, A. Guglielmi
7. Optimal Bayesian estimation of Gaussian mixtures with growing number of components (17 Jul 2020)
   Ilsang Ohn, Lizhen Lin
8. Task-Agnostic Online Reinforcement Learning with an Infinite Mixture of Gaussian Processes (19 Jun 2020)
   Mengdi Xu, Wenhao Ding, Jiacheng Zhu, Zuxin Liu, Baiming Chen, Ding Zhao
9. Estimating the Number of Components in Finite Mixture Models via the Group-Sort-Fuse Procedure (24 May 2020)
   Tudor Manole, Abbas Khalili
10. Robust estimation of mixing measures in finite mixture models (23 Sep 2017)
    Nhat Ho, X. Nguyen, Ya'acov Ritov