Capsule Networks (CapsNets), recently proposed by the Google Brain team, offer superior learning capabilities on machine learning tasks, such as image classification, compared to traditional CNNs. However, CapsNets are extremely computation-intensive and difficult to deploy in their original form on resource-constrained edge devices. This paper makes the first attempt to quantize CapsNet models by developing a specialized quantization framework for CapsNets, enabling their efficient edge implementations. We evaluate our framework on several benchmarks. On a deep CapsNet model for the CIFAR10 dataset, the framework reduces the memory footprint by 6.2x, with only 0.15% accuracy loss. We will open-source our framework at https://git.io/JvDIF in August 2020.
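The abstract does not detail the quantization scheme itself; as a rough illustration of the kind of weight compression involved, the following is a minimal sketch of symmetric uniform (fixed-point) quantization in NumPy. The function name quantize_uniform, the 8-bit setting, and the per-tensor symmetric scaling are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def quantize_uniform(weights, num_bits=8):
    """Simulate symmetric uniform quantization of a weight tensor to num_bits bits.

    Returns the de-quantized (simulated) weights and the scale factor.
    NOTE: illustrative sketch only; the paper's framework may differ.
    """
    qmax = 2 ** (num_bits - 1) - 1            # e.g. 127 for 8 bits
    scale = np.max(np.abs(weights)) / qmax    # symmetric range from the largest magnitude
    q = np.clip(np.round(weights / scale), -qmax, qmax)
    return q * scale, scale

# Example: quantizing a random capsule weight matrix to 8 bits
w = np.random.randn(16, 8).astype(np.float32)
w_q, s = quantize_uniform(w, num_bits=8)
print("max abs quantization error:", np.max(np.abs(w - w_q)))
```

Storing weights at such reduced bit-widths (rather than 32-bit floats) is what yields memory-footprint reductions of the order reported above, provided the accuracy drop is kept small.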