Although large-margin classifiers originally arose from an optimization framework, support vectors (SVs) can also be obtained through geometric approaches. This article presents advances in the use of Gabriel graphs (GGs) for binary and multiclass classification. For Chipclass, a GG-based binary classifier that requires neither hyperparameters nor optimization, we discuss how activation functions and support edge (SE)-centered neurons affect classification, proposing smoother functions and structural SV (SSV)-centered neurons to achieve margins with low probabilities and smoother classification contours. We extend the neural network architecture, which can be trained either via backpropagation with a softmax output and cross-entropy loss, or by solving a system of linear equations. We also propose a new subgraph-/distance-based membership function for graph regularization, along with a new GG recomputation algorithm that is less computationally expensive than the standard approach. Experimental results with the Friedman test show that our method outperformed previous GG-based classifiers and was statistically equivalent to tree-based models.
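For readers unfamiliar with the underlying structure: in a Gabriel graph, two samples are adjacent if and only if the ball whose diameter is the segment between them contains no other sample; GG-based classifiers then derive candidate SVs from edges joining points of opposite classes. Below is a minimal brute-force sketch of GG construction under that standard definition; it is illustrative only and is not the recomputation algorithm proposed in the article, and the function name `gabriel_edges` is our own.

```python
import numpy as np

def gabriel_edges(X):
    """Return Gabriel-graph edges of the points in X (an n x d array).

    Edge (i, j) exists iff the open ball whose diameter is the segment
    X[i]-X[j] contains no other sample. Brute force, O(n^3).
    """
    n = len(X)
    # Pairwise squared Euclidean distances.
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            # A point r lies strictly inside the diametral ball iff the
            # angle at r is obtuse: d(i,r)^2 + d(j,r)^2 < d(i,j)^2.
            blocked = any(
                D2[i, r] + D2[j, r] < D2[i, j]
                for r in range(n)
                if r != i and r != j
            )
            if not blocked:
                edges.append((i, j))
    return edges
```

For three collinear, evenly spaced points, only the two consecutive pairs are Gabriel neighbors, since the middle point falls inside the diametral ball of the outer pair; this is the kind of edge pruning that makes GG neighborhoods sparser than a full distance graph.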

