We prove that the $f$-divergences between univariate Cauchy distributions are always symmetric and can be expressed as strictly increasing functions of the chi-squared divergence. We report the corresponding functions for the total variation distance, the Kullback-Leibler divergence, the LeCam-Vincze divergence, the squared Hellinger divergence, the Taneja divergence, and the Jensen-Shannon divergence. We then show that this symmetric $f$-divergence property no longer holds for multivariate Cauchy distributions. Finally, we present several metrizations of $f$-divergences between univariate Cauchy distributions.
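As an illustration of the claimed relationship, the Kullback-Leibler case can be checked numerically. The sketch below assumes the closed forms reported in the literature for univariate Cauchy densities with location $\mu$ and scale $\gamma$: $\chi^2(p,q)=\frac{(\mu_1-\mu_2)^2+(\gamma_1-\gamma_2)^2}{2\gamma_1\gamma_2}$ and $\mathrm{KL}(p,q)=\log\frac{(\gamma_1+\gamma_2)^2+(\mu_1-\mu_2)^2}{4\gamma_1\gamma_2}$, from which $\mathrm{KL}=\log\!\left(1+\tfrac{\chi^2}{2}\right)$, a strictly increasing function of $\chi^2$ that is manifestly symmetric:

```python
import math

def chi2_cauchy(mu1, g1, mu2, g2):
    # Pearson chi-squared divergence between Cauchy(mu1, g1) and Cauchy(mu2, g2);
    # closed form assumed from the literature on Cauchy divergences.
    return ((mu1 - mu2) ** 2 + (g1 - g2) ** 2) / (2.0 * g1 * g2)

def kl_cauchy(mu1, g1, mu2, g2):
    # Kullback-Leibler divergence between two univariate Cauchy densities;
    # closed form assumed from the literature.
    return math.log(((g1 + g2) ** 2 + (mu1 - mu2) ** 2) / (4.0 * g1 * g2))

p = (0.0, 1.0)  # Cauchy(mu=0, gamma=1)
q = (3.0, 2.0)  # Cauchy(mu=3, gamma=2)

# Symmetry: KL(p, q) == KL(q, p) for Cauchy distributions.
assert math.isclose(kl_cauchy(*p, *q), kl_cauchy(*q, *p))

# Monotone relation: KL == log(1 + chi2 / 2).
assert math.isclose(kl_cauchy(*p, *q),
                    math.log(1.0 + chi2_cauchy(*p, *q) / 2.0))
```

Both formulas depend on the parameters only through $(\mu_1-\mu_2)^2$, $(\gamma_1-\gamma_2)^2$ (equivalently $(\gamma_1+\gamma_2)^2$), and the product $\gamma_1\gamma_2$, all of which are invariant under swapping the two distributions, which is why the symmetry holds identically rather than only at special parameter values.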