While modern convolutional neural networks achieve outstanding accuracy on many image classification tasks, they are, compared to humans, far more sensitive to image degradation. Here, we describe a variant of Batch Normalization, LocalNorm, that regularizes the normalization layer in the spirit of Dropout while dynamically adapting to the local image intensity and contrast at test time. We show that the resulting deep neural networks are much more robust to noise-induced image degradation, improving accuracy by up to a factor of three, while achieving the same or slightly better accuracy on non-degraded classical benchmarks. In computational terms, LocalNorm adds negligible training cost and little or no cost at inference time, and can be applied to already-trained networks in a straightforward manner.
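To make the idea concrete, below is a minimal illustrative sketch in PyTorch, not the authors' implementation. It assumes one plausible reading of the abstract: the batch is split into groups that are each normalized with their own statistics during training (a Dropout-like source of variability in the normalization layer), and at test time the statistics are computed from the incoming input itself, so normalization adapts to that input's intensity and contrast. The class name `LocalNormSketch` and the parameter `num_groups` are hypothetical.

```python
import torch
import torch.nn as nn


class LocalNormSketch(nn.Module):
    """Illustrative sketch only (assumed reading of the abstract, not the
    paper's reference code): a BatchNorm-style layer that normalizes
    per-group during training and per-input at test time."""

    def __init__(self, num_features, num_groups=4, eps=1e-5):
        super().__init__()
        self.num_groups = num_groups
        self.eps = eps
        # Learnable affine parameters, as in standard BatchNorm.
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))

    def _normalize(self, x):
        # Per-channel statistics over the (sub)batch and spatial dimensions.
        mean = x.mean(dim=(0, 2, 3), keepdim=True)
        var = x.var(dim=(0, 2, 3), keepdim=True, unbiased=False)
        return (x - mean) / torch.sqrt(var + self.eps)

    def forward(self, x):
        if self.training:
            # Split the batch into groups and normalize each group with its
            # own statistics; the group-to-group variation regularizes the
            # normalization layer in the spirit of Dropout.
            chunks = x.chunk(self.num_groups, dim=0)
            x = torch.cat([self._normalize(c) for c in chunks], dim=0)
        else:
            # Test time: use statistics of the current input rather than
            # stored running averages, adapting to its intensity/contrast.
            x = self._normalize(x)
        return x * self.weight.view(1, -1, 1, 1) + self.bias.view(1, -1, 1, 1)
```

Because the test-time path needs no stored running averages, a layer like this could in principle replace an existing BatchNorm layer in an already-trained network, consistent with the claim that LocalNorm can be applied to trained models with little extra cost.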