In mathematics, statistics, and computer science, and particularly in machine learning and inverse problems, regularization is the process of adding information in order to solve an ill-posed problem or to prevent overfitting. Regularization is applied to the objective function of an ill-posed optimization problem.
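As a minimal illustration of this idea (not taken from the text above), ridge regression adds an $L_2$ penalty $\lambda\|w\|^2$ to a least-squares objective; the sketch below, with hypothetical helper names, shows how the penalty turns a rank-deficient (ill-posed) problem into one with a unique solution.

```python
import numpy as np

def ridge_objective(w, X, y, lam):
    """L2-regularized least-squares objective: ||Xw - y||^2 + lam * ||w||^2."""
    residual = X @ w - y
    return residual @ residual + lam * (w @ w)

def ridge_solve(X, y, lam):
    """Closed-form minimizer (X^T X + lam I)^{-1} X^T y.

    The lam * I term makes the linear system invertible even when
    X^T X is singular, i.e., the ill-posed case."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Usage: a rank-deficient design matrix where plain least squares
# has no unique solution, but the regularized problem does.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))
X[:, 2] = X[:, 0] + X[:, 1]        # collinear column -> X^T X singular
y = rng.standard_normal(10)
w = ridge_solve(X, y, lam=0.1)     # regularization restores uniqueness
print(ridge_objective(w, X, y, 0.1))
```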

VIP Content

Title: CAST: A Correlation-based Adaptive Spectral Clustering Algorithm on Multi-scale Data

Abstract:

This paper studies the problem of clustering multi-scale data with spectral clustering. Traditional spectral clustering techniques discover clusters by processing a similarity matrix that reflects the proximity of objects. For multi-scale data, distance-based similarity is ineffective because the objects of a sparse cluster can be far apart, while the objects of a dense cluster must be sufficiently close. The spectral clustering problem on multi-scale data can be addressed by integrating the concept of "reachability similarity" of objects with a given distance-based similarity to derive a coefficient matrix of the objects. This paper proposes CAST, an algorithm that applies trace Lasso to regularize the coefficient matrix. We prove that the resulting coefficient matrix exhibits the "grouping effect" and "sparsity", and we show that these two properties imply highly effective spectral clustering. We evaluate CAST and 10 other clustering methods on a wide range of datasets w.r.t. various measures. Experimental results show that CAST performs well and is highly robust across test cases of multi-scale data.
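The paper's actual solver is not reproduced here; as a rough sketch of the two ingredients named in the abstract, the following snippet evaluates the trace-Lasso penalty $\|X\,\mathrm{Diag}(w)\|_*$ and runs standard spectral clustering on a symmetrized coefficient matrix. The function names and the simple pipeline are assumptions for illustration, not the CAST implementation.

```python
import numpy as np
from scipy.linalg import svdvals, eigh
from sklearn.cluster import KMeans

def trace_lasso(X, w):
    """Trace-Lasso penalty ||X Diag(w)||_* of coefficients w w.r.t. data X.

    The penalty adapts between the L1 and L2 norms of w depending on
    how correlated the columns of X are."""
    return svdvals(X * w).sum()  # X * w scales column j of X by w[j]

def spectral_clusters(C, k):
    """Standard spectral clustering on a coefficient matrix C:
    symmetrize into an affinity, embed with the bottom eigenvectors
    of the normalized Laplacian, then run k-means on the rows."""
    A = np.abs(C) + np.abs(C.T)                        # affinity from coefficients
    d = A.sum(axis=1) + 1e-12                          # degrees (guard against zeros)
    L = np.eye(len(A)) - A / np.sqrt(np.outer(d, d))   # normalized Laplacian
    _, vecs = eigh(L, subset_by_index=[0, k - 1])      # k smallest eigenpairs
    return KMeans(n_clusters=k, n_init=10).fit_predict(vecs)
```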


Latest Content

Neural plasticity is an important functionality of the human brain, in which the number of neurons and synapses can shrink or expand in response to stimuli throughout the span of life. We model this dynamic learning process as an $L_0$-norm regularized binary optimization problem, in which each unit of a neural network (e.g., a weight, neuron, or channel) is attached with a stochastic binary gate whose parameters determine the level of activity of the unit in the network. At the beginning, only a small portion of the binary gates (and therefore the corresponding neurons) are activated, while the remaining neurons are in a hibernation mode. As learning proceeds, some neurons may be activated or deactivated if doing so is justified by the cost-benefit tradeoff measured by the $L_0$-norm regularized objective. As training matures, the probability of transition between activation and deactivation diminishes until a final hardening stage. We demonstrate that all of these learning dynamics can be modulated seamlessly by a single parameter $k$. Our neural plasticity network (NPN) can prune or expand a network depending on the initial network capacity provided by the user; it also unifies dropout (when $k=0$) and traditional training of DNNs (when $k=\infty$), and interpolates between the two. To the best of our knowledge, this is the first learning framework that unifies network sparsification and network expansion in an end-to-end training pipeline. Extensive experiments on a synthetic dataset and multiple image classification benchmarks demonstrate the superior performance of NPN. We show that both network sparsification and network expansion can yield compact models of similar architectures, while retaining competitive accuracies of the original networks.
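As a hedged sketch of the gating mechanism described above (following the hard-concrete relaxation of Louizos et al., 2018, rather than NPN's exact formulation; the parameter $k$ and its schedule are omitted), a stochastic binary gate might look like:

```python
import math
import torch
import torch.nn as nn

class StochasticGate(nn.Module):
    """Relaxed binary gates z in [0, 1] for n units (hard-concrete).

    During training the gates are sampled stochastically so gradients
    flow through the relaxation; at test time they are deterministic.
    `l0_penalty` is the expected number of open gates."""
    def __init__(self, n, beta=2/3, gamma=-0.1, zeta=1.1):
        super().__init__()
        self.log_alpha = nn.Parameter(torch.zeros(n))  # per-unit gate logits
        self.beta, self.gamma, self.zeta = beta, gamma, zeta

    def forward(self):
        if self.training:
            u = torch.rand_like(self.log_alpha).clamp(1e-6, 1 - 1e-6)
            s = torch.sigmoid(
                (torch.log(u) - torch.log(1 - u) + self.log_alpha) / self.beta)
        else:
            s = torch.sigmoid(self.log_alpha)
        # stretch to (gamma, zeta), then hard-clip into [0, 1]
        return (s * (self.zeta - self.gamma) + self.gamma).clamp(0, 1)

    def l0_penalty(self):
        # P(gate != 0) summed over units, under the stretched distribution
        shift = self.beta * math.log(-self.gamma / self.zeta)
        return torch.sigmoid(self.log_alpha - shift).sum()
```

In use, the gate output would multiply the corresponding weights or channel activations elementwise, and a term like `lam * gate.l0_penalty()` would be added to the task loss to drive the cost-benefit tradeoff.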
