The old empathetic adage, ``Walk a mile in their shoes,'' asks that one imagine the difficulties others may face. This suggests a new counterfactual fairness criterion for ML, framed at the \textit{group} level: how would members of a nonprotected group fare if their group were subject to the conditions of some protected group? Instead of asking what sentence a particular Caucasian convict would receive if he were Black, we extend that notion to entire groups; e.g., how would the average sentence for all White convicts change if they were Black, but retained their same White characteristics, e.g., the same number of prior convictions? We frame the problem and study it empirically on several datasets. Our approach also addresses the problem of covariate correlation with sensitive attributes.
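The group-level flip described above can be sketched in a few lines. This is a minimal illustration on synthetic data, not the paper's method: all variable names, the attribute coding, and the data-generating process are assumptions made purely for demonstration. It trains a predictor, then compares the group's average predicted risk before and after flipping only the sensitive attribute for the entire group while keeping every other covariate fixed.

```python
# Hypothetical sketch of a group-level counterfactual comparison.
# Everything here (coding of race, the covariate "priors", the
# synthetic labels) is an illustrative assumption, not real data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
race = rng.integers(0, 2, n)            # 0 = White, 1 = Black (assumed coding)
priors = rng.poisson(2 + race, n)       # covariate correlated with the sensitive attribute
X = np.column_stack([race, priors])
# synthetic "harsh sentence" label depending on both columns
logits = 0.8 * race + 0.5 * priors - 2.0
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)

# Factual: average predicted risk for the White group as-is.
white = X[X[:, 0] == 0].copy()
p_factual = model.predict_proba(white)[:, 1].mean()

# Counterfactual: flip the whole group's sensitive attribute,
# keep all other characteristics (here, prior convictions) unchanged.
white_cf = white.copy()
white_cf[:, 0] = 1
p_counterfactual = model.predict_proba(white_cf)[:, 1].mean()

print(f"factual mean risk:        {p_factual:.3f}")
print(f"counterfactual mean risk: {p_counterfactual:.3f}")
print(f"group-level disparity:    {p_counterfactual - p_factual:.3f}")
```

The disparity printed at the end is one way to quantify how the group as a whole would fare under the other group's conditions, which is the question the criterion poses.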