On f-divergences between Cauchy distributions

Abstract
We prove that the f-divergences between univariate Cauchy distributions are always symmetric and can be expressed as strictly increasing functions of the chi-squared divergence. We report the corresponding functions for the total variation distance, the Kullback-Leibler divergence, the LeCam-Vincze divergence, the squared Hellinger divergence, the Taneja divergence, and the Jensen-Shannon divergence. We then show that this symmetric f-divergence property no longer holds for multivariate Cauchy distributions. Finally, we present several metrizations of f-divergences between univariate Cauchy distributions.
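As a concrete illustration of the claimed relationship, the following Python sketch numerically checks the Kullback-Leibler case. It assumes the closed forms reported in the literature on Cauchy divergences: the chi-squared divergence between Cauchy(l1, s1) and Cauchy(l2, s2) equals ((l1-l2)^2 + (s1-s2)^2) / (2 s1 s2), and the KL divergence equals log(1 + chi2/2). The function names are illustrative, not from the paper.

```python
# Minimal numerical sketch of the abstract's claim for the KL case,
# assuming the closed forms stated in the lead-in paragraph above.
import numpy as np
from scipy.integrate import quad

def cauchy_pdf(x, l, s):
    """Density of the Cauchy distribution with location l and scale s > 0."""
    return s / (np.pi * ((x - l) ** 2 + s ** 2))

def kl_numeric(l1, s1, l2, s2):
    """KL(p1 || p2) by numerical integration over the real line."""
    integrand = lambda x: cauchy_pdf(x, l1, s1) * np.log(
        cauchy_pdf(x, l1, s1) / cauchy_pdf(x, l2, s2)
    )
    val, _ = quad(integrand, -np.inf, np.inf)
    return val

def chi2_closed_form(l1, s1, l2, s2):
    """Assumed closed-form chi-squared divergence between two Cauchy densities."""
    return ((l1 - l2) ** 2 + (s1 - s2) ** 2) / (2.0 * s1 * s2)

l1, s1, l2, s2 = 0.0, 1.0, 2.0, 3.0
chi2 = chi2_closed_form(l1, s1, l2, s2)

print(kl_numeric(l1, s1, l2, s2))  # KL(p1 || p2)
print(kl_numeric(l2, s2, l1, s1))  # KL(p2 || p1): equal, by the symmetry property
print(np.log1p(chi2 / 2.0))        # log(1 + chi2/2): matches both values above
```

Running the sketch, both orderings of the KL divergence agree (approximately log(5/3) for these parameters) and coincide with the strictly increasing function log(1 + chi2/2) of the chi-squared divergence, consistent with the symmetry claim in the abstract.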