This paper introduces a Factor Augmented Sparse Throughput (FAST) model that utilizes both latent factors and sparse idiosyncratic components for nonparametric regression. The FAST model bridges factor models on one end and sparse nonparametric models on the other. It encompasses structured nonparametric models such as factor augmented additive models and sparse low-dimensional nonparametric interaction models, and it covers cases where the covariates do not admit a factor structure. Using diversified projections as estimates of the latent factor space, we employ truncated deep ReLU networks for nonparametric factor regression without regularization and for the more general FAST model with nonconvex regularization, resulting in the factor augmented regression using neural network (FAR-NN) and FAST-NN estimators, respectively. We show that the FAR-NN and FAST-NN estimators adapt to the unknown low-dimensional structure of hierarchical composition models and attain nonasymptotic minimax rates. We also study statistical learning for the factor augmented sparse additive model using a more specific neural network architecture. Our results apply to weakly dependent cases without factor structures. In proving the main technical result for FAST-NN, we establish a new deep ReLU network approximation result that contributes to the foundations of neural network theory. Our theory and methods are further supported by simulation studies and an application to macroeconomic data.
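A minimal schematic of a regression of this form, with the notation ($B$, $m^*$, $J$, $\varepsilon$) introduced here only for illustration of the abstract's description rather than taken verbatim from the paper, is

$$
\mathbf{x} = B\,\mathbf{f} + \mathbf{u}, \qquad
y = m^*\!\left(\mathbf{f}, \mathbf{u}_J\right) + \varepsilon, \qquad |J| \ll p,
$$

where $\mathbf{f}$ collects the latent factors driving the $p$-dimensional covariates $\mathbf{x}$, $\mathbf{u}$ collects the idiosyncratic components, and $J$ is a small index set so that only a sparse "throughput" of idiosyncratic components enters the unknown regression function $m^*$ alongside the factors.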