Stochastic gradient descent (SGD) and projected stochastic gradient descent (PSGD) are scalable algorithms for computing model parameters in unconstrained and constrained optimization problems, respectively. In comparison with SGD, PSGD forces its iterates into the constrained parameter space via projection. The convergence rate of PSGD-type estimates has been extensively studied, while statistical properties such as the asymptotic distribution remain less explored. From a purely statistical point of view, this paper studies the limiting distribution of the PSGD-based estimate when the true parameters satisfy some linear-equality constraints. Our theoretical findings reveal the role projection plays in the uncertainty of the PSGD estimate. As a byproduct, we propose an online hypothesis testing procedure for testing the linear-equality constraints. Simulation studies on synthetic data and an application to a real-world dataset confirm our theory.
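For concreteness, a minimal sketch of the projected update described above, under illustrative notation not fixed by the abstract: with stochastic loss $f(\theta;\xi_t)$, step sizes $\eta_t$, and a linear-equality constraint set $\mathcal{C}=\{\theta : A\theta = b\}$ where $A$ has full row rank, the PSGD iteration and its closed-form projection are
\[
\theta_{t+1} = \Pi_{\mathcal{C}}\bigl(\theta_t - \eta_t \nabla f(\theta_t;\xi_t)\bigr),
\qquad
\Pi_{\mathcal{C}}(\theta) = \theta - A^{\top}(AA^{\top})^{-1}(A\theta - b).
\]
The projection is an affine map onto $\mathcal{C}$, which is what allows the limiting distribution of the iterates to be characterized explicitly.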