We prove that, under low-noise assumptions, a support vector machine with $N\ll m$ random features (RFSVM) can achieve a learning rate faster than $O(1/\sqrt{m})$ on a training set with $m$ samples when an optimized feature map is used. Our work extends the previous fast-rate analysis of the random features method from the least squares loss to the 0-1 loss. We also show that a reweighted feature selection method, which approximates the optimized feature map, improves the performance of RFSVM in experiments on a synthetic data set.
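To make the setting concrete, here is a minimal sketch of an RFSVM pipeline: random Fourier features approximating a Gaussian kernel, followed by a linear SVM trained by subgradient descent on the hinge loss. This is an illustrative stand-in only; the optimized feature map and the reweighted feature selection method studied in the paper are not implemented here, and the synthetic labels, feature count $N$, and bandwidth are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification data: m samples in d dimensions.
# The XOR-style labels below are a hypothetical choice for illustration.
m, d, N = 500, 2, 50                  # N random features, with N << m
X = rng.normal(size=(m, d))
y = np.sign(X[:, 0] * X[:, 1])        # nonlinear decision boundary

# Plain random Fourier feature map approximating a Gaussian (RBF) kernel,
# standing in for the optimized feature map analyzed in the paper.
gamma = 1.0
W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, N))
b = rng.uniform(0, 2 * np.pi, size=N)
Z = np.sqrt(2.0 / N) * np.cos(X @ W + b)   # feature matrix, shape (m, N)

# Linear SVM on the random features: batch subgradient descent on the
# L2-regularized hinge loss (1/m) * sum max(0, 1 - y * <w, z>) + (lam/2)||w||^2.
w = np.zeros(N)
lam, lr = 1e-3, 0.5
for _ in range(2000):
    margins = y * (Z @ w)
    viol = margins < 1                       # points violating the margin
    grad = lam * w - Z[viol].T @ y[viol] / m
    w -= lr * grad

acc = np.mean(np.sign(Z @ w) == y)           # training accuracy of the sketch
```

The point of the sketch is only the structure of the method: an $m \times N$ feature matrix with $N \ll m$ replaces the full $m \times m$ kernel matrix, so training scales with $N$ rather than $m$.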