Due to the curse of dimensionality and limited training data, approximating high-dimensional functions is a very challenging task, even for powerful deep neural networks. Inspired by the Nonlinear Level set Learning (NLL) method, which uses the reversible residual network (RevNet), in this paper we propose a new method of Dimension Reduction via Learning Level Sets (DRiLLS) for function approximation. Our method contains two major components: one is the pseudo-reversible neural network (PRNN) module that effectively transforms high-dimensional input variables to low-dimensional active variables, and the other is the synthesized regression module that approximates function values based on the transformed data in the low-dimensional space. The PRNN not only relaxes the invertibility constraint on the nonlinear transformation that the NLL method imposes through its use of RevNet, but also adaptively weights the influence of each sample and controls the sensitivity of the function to the learned active variables. The synthesized regression uses Euclidean distance in the input space to select neighboring samples, whose projections onto the space of active variables are used to perform local least-squares polynomial fitting. This helps to resolve numerical oscillation issues present in traditional local and global regressions. Extensive experimental results demonstrate that our DRiLLS method outperforms both the NLL and Active Subspace methods, especially when the target function possesses critical points in the interior of its input domain.
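The synthesized regression step described above can be illustrated with a minimal sketch: neighbors of a query point are selected by Euclidean distance in the original input space, their images under the learned transform are used as low-dimensional coordinates, and a local least-squares polynomial fit produces the prediction. This sketch assumes a single scalar active variable and a generic transform `g`; the function and parameter names are illustrative, not the paper's actual API.

```python
import numpy as np

def local_ls_predict(x_query, g, X, y, n_neighbors=20, degree=2):
    """Hedged sketch of local least-squares regression on active variables.

    x_query : query point in the high-dimensional input space
    g       : learned map from input space to a scalar active variable
              (stands in for the PRNN's dimension-reducing transform)
    X, y    : training inputs (n, d) and function values (n,)
    """
    # Select neighbors by Euclidean distance in the *input* space,
    # as described in the abstract.
    dists = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(dists)[:n_neighbors]

    # Project the selected samples onto the active variable.
    z = np.array([g(xi) for xi in X[idx]], dtype=float)

    # Local least-squares polynomial fit in the active variable.
    A = np.vander(z, degree + 1)  # columns: z^degree, ..., z, 1
    coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)

    # Evaluate the fitted polynomial at the query's active coordinate.
    return float(np.polyval(coef, float(g(x_query))))
```

For a function that truly depends on one linear combination of the inputs, e.g. f(x) = (x1 + x2)^2 with g(x) = x1 + x2, a degree-2 local fit recovers the value at the query point essentially exactly.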