Partitioning a set of elements into a given number of groups of a priori unknown sizes is a critical task in many applications. It can be characterized by a hypergeometric distribution, which describes biased sampling without replacement, where the bias is given by the relative importance of the sample classes. Due to its hard constraints, this discrete distribution is not differentiable in its standard formulation, which prohibits its use in modern machine learning frameworks. Hence, previous works mostly fall back on suboptimal heuristics or simplified assumptions. In this work, we propose a differentiable reparameterization trick for the multivariate noncentral hypergeometric distribution. We introduce reparameterizable gradients that enable learning of the importance, or selection bias, between groups. We highlight the applicability and usability of the proposed formulation in two different experiments: weakly-supervised learning and clustering.
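The setting described in the abstract can be made concrete with a small sketch. The NumPy snippet below is illustrative only and is not the reparameterization proposed in this work: it shows (a) a hard, non-differentiable sequential scheme for biased sampling without replacement, where each remaining item of a class is drawn with probability proportional to that class's importance weight (a Wallenius-type scheme), and (b) a generic Gumbel-softmax-style stand-in in which the hard categorical draw is replaced by a temperature-controlled softmax, so the resulting soft group counts vary smoothly with the log-importance weights. All function names and the specific relaxation are assumptions made for illustration.

```python
# Sketch: biased sampling without replacement and a soft relaxation.
# NOTE: illustrative only; NOT the paper's proposed reparameterization.
import numpy as np

def hard_sample(counts, weights, n_draws, rng):
    """Wallenius-type sequential draw: pick items one at a time without
    replacement, each remaining item of class c chosen with probability
    proportional to weights[c]. The argmax-style draw is non-differentiable."""
    remaining = np.asarray(counts, dtype=float).copy()
    drawn = np.zeros_like(remaining)
    for _ in range(n_draws):
        p = weights * remaining
        p = p / p.sum()
        c = rng.choice(len(remaining), p=p)
        remaining[c] -= 1.0
        drawn[c] += 1.0
    return drawn

def relaxed_sample(counts, log_weights, n_draws, tau, rng):
    """Soft stand-in: replace each categorical draw by a Gumbel-softmax
    sample, so the returned soft counts depend smoothly on log_weights."""
    remaining = np.asarray(counts, dtype=float).copy()
    drawn = np.zeros_like(remaining)
    for _ in range(n_draws):
        gumbel = rng.gumbel(size=remaining.shape)
        logits = log_weights + np.log(np.maximum(remaining, 1e-9)) + gumbel
        z = logits / tau
        z = z - z.max()                      # numerical stability
        soft = np.exp(z) / np.exp(z).sum()   # soft one-hot over classes
        remaining = remaining - soft
        drawn = drawn + soft
    return drawn
```

As the temperature `tau` tends to zero, the soft one-hot approaches a hard draw; in a deep-learning framework, `log_weights` would be the learnable parameter through which gradients flow, which is the role the importance, or selection bias, plays in the abstract.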