We prove that the $f$-divergences between univariate Cauchy distributions are always symmetric and can be expressed as functions of the chi-squared divergence. We give explicit formulas for these functions in the cases of the total variation distance, the Kullback-Leibler divergence, the LeCam-Vincze divergence, the squared Hellinger divergence, and the Jensen-Shannon divergence. We then show that this symmetric $f$-divergence property no longer holds for multivariate Cauchy distributions. Finally, we present several metrizations of $f$-divergences between univariate Cauchy distributions.
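As a hedged numerical illustration of the stated results, the sketch below checks two facts on an example pair of univariate Cauchy distributions: that the Kullback-Leibler divergence is symmetric, and that it equals a function of the chi-squared divergence. The closed forms used, $\chi^2(p_{l_1,s_1}:p_{l_2,s_2})=\frac{(l_1-l_2)^2+(s_1-s_2)^2}{2s_1s_2}$ and $\mathrm{KL}=\log(1+\chi^2/2)$, are assumptions taken from the literature on Cauchy divergences, not stated in the abstract itself; the KL integral is evaluated by a simple midpoint rule under the substitution $x=\tan\theta$.

```python
import math

def cauchy_pdf(x, l, s):
    # Density of the univariate Cauchy distribution with location l, scale s.
    return s / (math.pi * ((x - l) ** 2 + s ** 2))

def chi2_cauchy(l1, s1, l2, s2):
    # Assumed closed-form chi-squared divergence between Cauchy densities.
    return ((l1 - l2) ** 2 + (s1 - s2) ** 2) / (2.0 * s1 * s2)

def kl_numeric(l1, s1, l2, s2, n=100_000):
    # KL(p:q) = \int p log(p/q) dx, computed by a midpoint rule after the
    # substitution x = tan(theta), dx = (1 + x^2) d(theta).
    total = 0.0
    for i in range(n):
        t = -math.pi / 2 + (i + 0.5) * math.pi / n
        x = math.tan(t)
        p = cauchy_pdf(x, l1, s1)
        q = cauchy_pdf(x, l2, s2)
        total += p * math.log(p / q) * (1.0 + x * x) * (math.pi / n)
    return total

l1, s1, l2, s2 = 0.0, 1.0, 2.0, 3.0
chi2 = chi2_cauchy(l1, s1, l2, s2)
kl_pq = kl_numeric(l1, s1, l2, s2)
kl_qp = kl_numeric(l2, s2, l1, s1)   # symmetry: should match kl_pq
kl_closed = math.log(1.0 + chi2 / 2.0)
print(kl_pq, kl_qp, kl_closed)       # all three agree (here log(5/3))
```

For this example, $\chi^2 = 4/3$, so the closed form gives $\log(5/3)$; both KL orientations agree with it numerically, consistent with the symmetry and chi-squared-expressibility claims of the abstract.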