On f-divergences between Cauchy distributions

We prove that the f-divergences between univariate Cauchy distributions are all symmetric, and can be expressed as strictly increasing scalar functions of the symmetric chi-squared divergence. We report the corresponding scalar functions for the total variation distance, the Kullback-Leibler divergence, the squared Hellinger divergence, and the Jensen-Shannon divergence, among others. Next, we give conditions for expanding the f-divergences as convergent infinite series of higher-order power chi divergences, and illustrate the criterion for convergent Taylor series expressing the f-divergences between Cauchy distributions. We then show that the symmetry property of f-divergences holds for multivariate location-scale families with prescribed matrix scales provided that the standard density is even, which includes the cases of the multivariate normal and Cauchy families. However, the f-divergences between multivariate Cauchy densities with different scale matrices are shown to be asymmetric. Finally, we present several metrizations of f-divergences between univariate Cauchy distributions and further report geometric embedding properties of these metrics.
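As an illustrative sketch, the symmetry claim can be checked numerically for the Kullback-Leibler divergence. The closed forms used below, KL(p:q) = log(((s1+s2)^2 + (l1-l2)^2) / (4 s1 s2)) and the symmetric chi-squared divergence chi = ((l1-l2)^2 + (s1-s2)^2) / (2 s1 s2) with KL = log(1 + chi/2), are assumptions drawn from the literature on Cauchy divergences, not quoted from this abstract:

```python
import math

def cauchy_pdf(x, loc, scale):
    """Density of the univariate Cauchy distribution."""
    return scale / (math.pi * ((x - loc) ** 2 + scale ** 2))

def kl_numeric(l1, s1, l2, s2, n=200_000):
    """KL(p:q) by midpoint quadrature after x = tan(t),
    mapping (-inf, inf) onto (-pi/2, pi/2)."""
    total = 0.0
    h = math.pi / n
    for i in range(n):
        t = -math.pi / 2 + (i + 0.5) * h
        x = math.tan(t)
        jac = 1.0 + x * x  # dx/dt = sec^2 t = 1 + tan^2 t
        p = cauchy_pdf(x, l1, s1)
        q = cauchy_pdf(x, l2, s2)
        total += p * math.log(p / q) * jac * h
    return total

def kl_closed_form(l1, s1, l2, s2):
    # Assumed closed form: KL = log(((s1+s2)^2 + (l1-l2)^2) / (4 s1 s2)),
    # manifestly symmetric in (l1, s1) <-> (l2, s2).
    return math.log(((s1 + s2) ** 2 + (l1 - l2) ** 2) / (4 * s1 * s2))

def chi_squared_symmetric(l1, s1, l2, s2):
    # Assumed symmetric chi-squared divergence between Cauchy densities.
    return ((l1 - l2) ** 2 + (s1 - s2) ** 2) / (2 * s1 * s2)

if __name__ == "__main__":
    l1, s1, l2, s2 = 0.0, 1.0, 1.0, 2.0
    print(kl_numeric(l1, s1, l2, s2))        # numeric KL(p:q)
    print(kl_numeric(l2, s2, l1, s1))        # numeric KL(q:p), equal by symmetry
    print(kl_closed_form(l1, s1, l2, s2))
    chi = chi_squared_symmetric(l1, s1, l2, s2)
    print(math.log(1.0 + chi / 2.0))         # KL as a scalar function of chi
```

Under these assumed formulas, KL is the strictly increasing function log(1 + chi/2) of the symmetric chi-squared divergence, matching the structure of the result stated above.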