Asymptotic Properties of Bayesian Predictive Densities When the Distributions of Data and Target Variables are Different

26 March 2015
F. Komaki
arXiv:1503.07643
Abstract

Bayesian predictive densities when the observed data x and the target variable y to be predicted have different distributions are investigated by using the framework of information geometry. The performance of predictive densities is evaluated by the Kullback--Leibler divergence. The parametric models are formulated as Riemannian manifolds. In the conventional setting in which x and y have the same distribution, the Fisher--Rao metric and the Jeffreys prior play essential roles. In the present setting in which x and y have different distributions, the corresponding roles are played by a new metric, which we call the predictive metric, constructed from the Fisher information matrices of x and y, and by the volume element based on the predictive metric. It is shown that Bayesian predictive densities based on priors constructed from non-constant positive superharmonic functions with respect to the predictive metric asymptotically dominate those based on the volume element prior of the predictive metric.
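As a sketch of the quantities the abstract names, the following are the standard textbook definitions (not reproduced verbatim from the paper): with data x drawn from p(x|θ) and target y drawn from q(y|θ), a predictive density is scored by Kullback--Leibler loss, the two models carry separate Fisher information matrices, and the Bayesian predictive density averages the target model over the posterior.

```latex
% Kullback--Leibler loss of a predictive density \hat{p}(y \mid x)
% against the true target density q(y \mid \theta):
D\bigl(q(\cdot \mid \theta) \,\big\|\, \hat{p}(\cdot \mid x)\bigr)
  = \int q(y \mid \theta) \log \frac{q(y \mid \theta)}{\hat{p}(y \mid x)} \,\mathrm{d}y .

% Fisher information matrices of the data model and the target model,
% the ingredients from which the predictive metric is constructed:
g^{(x)}_{ij}(\theta)
  = \mathrm{E}_{x}\!\left[\partial_i \log p(x \mid \theta)\,
                          \partial_j \log p(x \mid \theta)\right],
\qquad
g^{(y)}_{ij}(\theta)
  = \mathrm{E}_{y}\!\left[\partial_i \log q(y \mid \theta)\,
                          \partial_j \log q(y \mid \theta)\right].

% Bayesian predictive density based on a prior \pi:
\hat{p}_{\pi}(y \mid x)
  = \int q(y \mid \theta)\, \pi(\theta \mid x)\, \mathrm{d}\theta,
\qquad
\pi(\theta \mid x) \propto p(x \mid \theta)\, \pi(\theta).
```

In the conventional setting the two Fisher matrices coincide and the Jeffreys prior is the volume element of that common metric; the paper's predictive metric generalizes this construction to the case g^{(x)} ≠ g^{(y)}.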
