Optimization problems with quadratic equality constraints are prevalent in machine learning. Two important examples are Canonical Correlation Analysis (CCA) and Linear Discriminant Analysis (LDA). Unfortunately, methods for solving such problems typically involve computing matrix inverses and decompositions. For the aforementioned problems, these matrices are actually Gram matrices of the input data matrices, so the computations are too expensive for large-scale datasets. In this paper, we propose a sketching-based approach for solving CCA and LDA that reduces the cost's dependence on the input size. The proposed algorithms feature randomized preconditioning combined with Riemannian optimization.