Quantum neural networks (QNNs) are the quantum-computing analog of classical neural networks and are represented by a unitary matrix with trainable parameters. Inspired by the universal approximation property of classical neural networks, which ensures that every continuous function can be approximated uniformly and arbitrarily well on a compact subset of a Euclidean space, several recent works have established analogous results for QNNs, ranging from single-qubit to multi-qubit QNNs and even hybrid classical-quantum models. In this paper, we study the approximation capabilities of QNNs for periodic functions with respect to the supremum norm. We use the Jackson inequality to approximate a given function by implementing its approximating trigonometric polynomial via a suitable QNN. In particular, we show that by restricting to the class of periodic functions, one can achieve a quadratic reduction in the number of parameters, yielding better approximation results than those in the literature. Moreover, the smoother the function, the fewer parameters are needed to construct an approximating QNN.
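For context, the approximation step rests on a classical result; the following is a standard textbook form of the Jackson inequality, stated here for reference with generic notation ($T_n$, $C_r$, $\omega$) that need not match the constants or formulation used in the body of the paper. If $f$ is a $2\pi$-periodic, $r$-times continuously differentiable function, then for every $n \ge 1$ there exists a trigonometric polynomial $T_n$ of degree at most $n$ such that
\[
  \| f - T_n \|_{\infty} \;\le\; \frac{C_r}{n^{r}}\, \omega\!\left(f^{(r)}, \tfrac{1}{n}\right),
\]
where $\omega(g,\delta)$ denotes the modulus of continuity of $g$ and $C_r$ depends only on $r$. Under this bound, a prescribed accuracy requires a lower polynomial degree $n$ for smoother $f$, which is consistent with the abstract's claim that smoother functions need fewer QNN parameters.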