Understanding heterogeneous treatment effects (HTEs) plays a key role in many contemporary causal inference applications arising from diverse areas. Most existing work has focused on the estimation of HTEs, yet the statistical inference aspect of the problem remains relatively undeveloped. In this paper, we investigate the inference of HTEs in a nonparametric setting for randomized experiments. We formulate the problem as two separate nonparametric mean regressions, one for the control group and the other for the treatment group. For each mean regression, we extend the tool of $k$-nearest neighbors to the framework of distributional nearest neighbors (DNN). We show that the DNN estimator admits two equivalent representations, as an L-statistic and as a U-statistic: the former allows easy and fast implementation, while the latter enables us to obtain a higher-order asymptotic expansion of the bias and to establish asymptotic normality. To reduce the finite-sample bias of DNN, we further suggest a new method of two-scale distributional nearest neighbors (TDNN). Under some regularity conditions, we show through delicate higher-order asymptotic expansions that the TDNN heterogeneous treatment effect estimator is asymptotically normal. We further establish the consistency of both jackknife and bootstrap variance estimates for the TDNN estimator, enabling user-friendly inference tools for heterogeneous treatment effects. The theoretical results and appealing finite-sample performance of the suggested TDNN method are illustrated with several simulation examples and a children's birth weight application.
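To make the abstract's ideas concrete, here is a minimal Python sketch of the estimators it describes. It is an illustration under stated assumptions, not the paper's implementation: the DNN estimate is computed through its L-statistic representation, where the $i$-th nearest neighbor of the query point receives weight $\binom{n-i}{s-1}/\binom{n}{s}$ (the probability that it is the 1-nearest neighbor within a random subsample of size $s$, which is what the equivalent U-statistic form averages over); the TDNN estimate combines two DNN scales $s_1 < s_2$ with weights summing to one and chosen to cancel bias terms assumed to be of order $s^{-2/d}$, following the two-scale idea in the abstract; and the HTE estimate is the difference of the two group-wise regressions. All function names and the specific bias-cancellation weights are this sketch's assumptions.

```python
import math
import numpy as np

def dnn_weights(n, s):
    # L-statistic weights for the DNN estimator with subsampling scale s:
    # the i-th nearest neighbor (1-indexed) gets weight C(n-i, s-1) / C(n, s),
    # i.e. the probability it is the 1-NN in a random size-s subsample.
    denom = math.comb(n, s)
    return np.array([math.comb(n - i, s - 1) / denom if n - i >= s - 1 else 0.0
                     for i in range(1, n + 1)])

def dnn_estimate(X, y, x0, s):
    # DNN mean-regression estimate at x0: responses sorted by distance to x0,
    # then averaged with the distributional weights above.
    order = np.argsort(np.linalg.norm(X - x0, axis=1))
    return float(dnn_weights(len(y), s) @ y[order])

def tdnn_estimate(X, y, x0, s1, s2):
    # Two-scale DNN: weights w1 + w2 = 1 solving
    # w1 * s1^(-2/d) + w2 * s2^(-2/d) = 0, so the leading bias terms
    # (assumed of order s^(-2/d)) cancel between the two scales.
    d = X.shape[1]
    w1 = s2 ** (-2 / d) / (s2 ** (-2 / d) - s1 ** (-2 / d))
    return w1 * dnn_estimate(X, y, x0, s1) + (1 - w1) * dnn_estimate(X, y, x0, s2)

def hte_estimate(X, y, treated, x0, s1, s2):
    # Heterogeneous treatment effect at x0: difference of the two separate
    # nonparametric mean regressions (treatment group minus control group).
    t = treated.astype(bool)
    return (tdnn_estimate(X[t], y[t], x0, s1, s2)
            - tdnn_estimate(X[~t], y[~t], x0, s1, s2))
```

Two quick sanity checks follow from the construction: the DNN weights sum to one (the hockey-stick identity $\sum_i \binom{n-i}{s-1} = \binom{n}{s}$), so a constant response is reproduced exactly, and the TDNN weights also sum to one, so the bias correction does not disturb constant signals.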