
arXiv:1411.3427

Two-sample Bayesian nonparametric goodness-of-fit test

13 November 2014
Luai Al Labadi
E. Masuadi
M. Zarepour
Abstract

Testing the difference between two data samples is of particular interest in statistics. Precisely, given two samples $\mathbf{X}=X_1,\ldots,X_{m_1} \overset{i.i.d.}{\sim} F$ and $\mathbf{Y}=Y_1,\ldots,Y_{m_2} \overset{i.i.d.}{\sim} G$, with $F$ and $G$ being unknown continuous cumulative distribution functions, we wish to test the null hypothesis $\mathcal{H}_0:\ F=G$. In this paper, we propose an effective and convenient Bayesian nonparametric approach to assessing the equality of two unknown distributions. The method is based on the Kolmogorov distance and approximate samples from the Dirichlet process centered at the standard normal distribution with concentration parameter 1. Our results show that the proposed test is robust with respect to any prior specification of the Dirichlet process. We provide simulated examples to illustrate the workings of the method. Overall, the proposed method performs very well in many cases.
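To make the ingredients of such a test concrete, the following Python sketch draws approximate Dirichlet process samples by truncated stick-breaking with base measure N(0,1) and concentration 1, as described in the abstract, and computes the Kolmogorov (sup-norm) distance between two DP draws. The posterior-versus-prior comparison at the end is only an illustrative assumption about how such distances might be used in a two-sample test, not the authors' exact decision rule; the truncation level, sample sizes, and seed are likewise hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_stick_breaking(a, base_rvs, n_atoms=500):
    """Truncated stick-breaking draw from DP(a, H); H is given by the sampler base_rvs."""
    betas = rng.beta(1.0, a, size=n_atoms)
    w = betas * np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    return base_rvs(n_atoms), w / w.sum()  # renormalize to absorb truncation error

def discrete_cdf(x, atoms, w):
    """Evaluate the CDF of a discrete measure (atoms, weights) at the points x."""
    order = np.argsort(atoms)
    atoms_sorted, cum_w = atoms[order], np.cumsum(w[order])
    idx = np.searchsorted(atoms_sorted, x, side="right")
    out = np.zeros_like(x, dtype=float)
    out[idx > 0] = cum_w[idx[idx > 0] - 1]
    return out

def kolmogorov(atoms1, w1, atoms2, w2):
    """Sup-norm distance between the CDFs of two discrete DP draws."""
    grid = np.sort(np.concatenate([atoms1, atoms2]))
    return np.max(np.abs(discrete_cdf(grid, atoms1, w1) - discrete_cdf(grid, atoms2, w2)))

def posterior_base_sampler(data, a):
    """Sampler from the DP posterior base measure (a*N(0,1) + empirical)/(a + m)."""
    m = len(data)
    def rvs(n):
        from_prior = rng.random(n) < a / (a + m)
        return np.where(from_prior, rng.standard_normal(n), rng.choice(data, size=n))
    return rvs

# Hypothetical two-sample data: both samples actually come from N(0,1).
a = 1.0
X, Y = rng.standard_normal(50), rng.standard_normal(60)

# Kolmogorov distances between posterior DP draws fitted to each sample.
post = [kolmogorov(*dp_stick_breaking(a + len(X), posterior_base_sampler(X, a)),
                   *dp_stick_breaking(a + len(Y), posterior_base_sampler(Y, a)))
        for _ in range(200)]

# Reference distances between two independent prior DP(1, N(0,1)) draws.
prior = [kolmogorov(*dp_stick_breaking(a, rng.standard_normal),
                    *dp_stick_breaking(a, rng.standard_normal))
         for _ in range(200)]

print(np.mean(post), np.mean(prior))
```

Under the null hypothesis the posterior distances should look similar to the prior reference distances, while a real difference between $F$ and $G$ would shift the posterior distances upward; how that comparison is calibrated into a formal test is specified in the paper itself.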
