We consider the two-group classification problem and propose a kernel classifier based on the optimal scoring framework. Unlike previous approaches, we provide theoretical guarantees on the expected risk consistency of the method. We also allow for feature selection by imposing structured sparsity via weighted kernels. We propose fully automated methods for selecting all tuning parameters, and in particular adapt kernel shrinkage ideas to ridge parameter selection. Numerical studies demonstrate the superior classification performance of the proposed approach compared to existing nonparametric classifiers.
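To fix ideas, the two-group optimal-scoring construction can be sketched as a kernel ridge problem: classes receive mean-zero, unit-variance scores, and the discriminant coefficients solve a regularized linear system in the kernel matrix. This is a minimal illustration, not the paper's actual estimator; the Gaussian RBF kernel, the score encoding, and the fixed ridge parameter `lam` are assumptions made here for concreteness.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_kernel_optimal_scoring(X, y, lam=0.1, gamma=1.0):
    """Fit a two-group kernel optimal-scoring discriminant.

    y is a 0/1 label vector.  Each class gets an optimal score so the
    score vector theta is mean-zero with unit variance; the coefficients
    then solve the kernel ridge system (K + lam*I) alpha = theta.
    """
    n = len(y)
    n0, n1 = np.sum(y == 0), np.sum(y == 1)
    theta = np.where(y == 0, np.sqrt(n1 / (n * n0)), -np.sqrt(n0 / (n * n1)))
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(n), theta)
    # classification cut-off: midpoint of the two class means of the
    # training discriminant values
    f_train = K @ alpha
    cut = 0.5 * (f_train[y == 0].mean() + f_train[y == 1].mean())
    return alpha, cut

def predict(X_train, alpha, cut, X_new, gamma=1.0):
    """Classify new points by thresholding the discriminant at cut."""
    f = rbf_kernel(X_new, X_train, gamma) @ alpha
    return (f < cut).astype(int)  # class 0 was encoded with the larger score
```

In the actual method the ridge parameter `lam` would be chosen automatically (e.g. via the kernel shrinkage ideas the abstract mentions) rather than fixed, and weighted kernels would replace the plain RBF kernel to induce feature sparsity.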