Sketching is a randomized dimensionality-reduction method that aims to preserve relevant information in large-scale datasets. Count sketch is a simple and popular sketch that uses a randomized hash function to achieve compression. In this paper, we propose a novel extension known as Higher-order Count Sketch (HCS). While count sketch uses a single hash function, HCS uses multiple (smaller) hash functions for sketching. HCS reshapes the input (vector) data into a higher-order tensor and employs a tensor product of the random hash functions to compute the sketch. This results in an exponential saving (with respect to the order of the tensor) in the memory requirements of the hash functions, under certain conditions on the input data. Furthermore, when the input data itself has an underlying structure in the form of various tensor representations, such as the Tucker decomposition, we obtain significant advantages. We derive efficient (approximate) computation of various tensor operations, such as tensor products and tensor contractions, directly on the sketched data. Thus, HCS is the first sketch to fully exploit the multi-dimensional nature of higher-order tensors. We apply HCS to tensorized neural networks, where we replace fully connected layers with sketched tensor operations. We achieve nearly state-of-the-art accuracy with significant compression on the image classification benchmark.
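To make the construction concrete, the following is a minimal illustrative sketch (not the paper's implementation) of classic count sketch and of HCS on a reshaped input. The function names (`count_sketch`, `hcs`, `hcs_recover`), the use of NumPy, and the dense accumulation loop are all assumptions for exposition; note how HCS stores one small hash per tensor mode instead of one hash over the full index range.

```python
import numpy as np

rng = np.random.default_rng(0)

def count_sketch(x, m):
    """Classic count sketch: one hash h:[n]->[m] and one sign s:[n]->{+1,-1}.

    Storing h and s takes O(n) memory for an input of length n."""
    n = x.size
    h = rng.integers(0, m, size=n)        # random hash function
    s = rng.choice([-1, 1], size=n)       # random sign function
    c = np.zeros(m)
    np.add.at(c, h, s * x)                # c[h(i)] += s(i) * x[i]
    return c, h, s

def hcs(x, shape, ms):
    """Higher-order count sketch (illustrative): reshape x into an order-p
    tensor and hash each mode independently.

    Only sum(shape) hash entries are stored instead of prod(shape), which
    is the source of the memory saving in the hash functions."""
    T = x.reshape(shape)
    hs = [rng.integers(0, m, size=n) for n, m in zip(shape, ms)]
    ss = [rng.choice([-1, 1], size=n) for n in shape]
    S = np.zeros(ms)
    # Accumulate T[i1..ip] * s1(i1)...sp(ip) into S[h1(i1),...,hp(ip)].
    for idx in np.ndindex(*shape):
        sign = np.prod([ss[k][i] for k, i in enumerate(idx)])
        pos = tuple(hs[k][i] for k, i in enumerate(idx))
        S[pos] += sign * T[idx]
    return S, hs, ss

def hcs_recover(S, hs, ss, shape):
    """Unbiased estimate of each entry of the original tensor."""
    T_hat = np.empty(shape)
    for idx in np.ndindex(*shape):
        sign = np.prod([ss[k][i] for k, i in enumerate(idx)])
        pos = tuple(hs[k][i] for k, i in enumerate(idx))
        T_hat[idx] = sign * S[pos]
    return T_hat
```

For a sparse input with a single heavy entry, both sketches recover that entry exactly; in general the estimates are unbiased with variance controlled by the sketch sizes:

```python
x = np.zeros(64)
x[5] = 3.0
S, hs, ss = hcs(x, (8, 8), (4, 4))       # order-2 reshape, 4x4 sketch
T_hat = hcs_recover(S, hs, ss, (8, 8))
print(T_hat.reshape(-1)[5])              # recovers 3.0
```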