The convolutional neural network (CNN) is one of the most widely used and successful architectures in the era of deep learning. However, the high computational cost of CNNs still hampers their wider deployment on lightweight devices. Fortunately, applying the Fourier transform to convolution offers an elegant and promising way to dramatically reduce this cost. Recently, several studies have tackled this challenging problem, pursuing complete frequency-domain computation without any switching between the spatial and frequency domains. In this work, we revisit Fourier transform theory to derive feed-forward and back-propagation frequency-domain operations for typical network modules such as convolution, activation, and pooling. Because most computation tools have limited support for complex-number calculation, we further extend the Fourier transform to the Laplace transform for CNNs, which can run in the real domain under more relaxed constraints. This work focuses on a theoretical extension of, and discussion about, frequency-domain CNNs, and lays some theoretical groundwork for practical applications.
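The cost reduction the abstract alludes to rests on the convolution theorem: a convolution in the spatial domain becomes an element-wise product in the frequency domain. A minimal NumPy sketch of this equivalence is shown below; the function names (`direct_conv2d`, `fft_conv2d`) are illustrative and not taken from the paper.

```python
import numpy as np

def direct_conv2d(x, k):
    """Full 2-D convolution by explicit summation (reference implementation)."""
    H, W = x.shape
    h, w = k.shape
    out = np.zeros((H + h - 1, W + w - 1))
    # y[m, n] = sum_{i, j} k[i, j] * x[m - i, n - j]: shift-and-add form.
    for i in range(h):
        for j in range(w):
            out[i:i + H, j:j + W] += k[i, j] * x
    return out

def fft_conv2d(x, k):
    """Same result via the convolution theorem: zero-pad both inputs to the
    full output size, multiply pointwise in the frequency domain, invert."""
    H, W = x.shape
    h, w = k.shape
    s = (H + h - 1, W + w - 1)
    X = np.fft.fft2(x, s)
    K = np.fft.fft2(k, s)
    return np.real(np.fft.ifft2(X * K))

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
k = rng.standard_normal((3, 3))
assert np.allclose(direct_conv2d(x, k), fft_conv2d(x, k))
```

For a kernel of size h×w on an H×W input, the direct sum costs O(HWhw) while the FFT route costs O(HW log(HW)) regardless of kernel size, which is the source of the speedup for large kernels; the challenge the paper addresses is keeping *all* modules (activation, pooling) in the frequency domain so the transforms need not be repeated per layer.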