Adaptive learning, also known as adaptive teaching, is an educational method that uses computer algorithms to orchestrate the interaction with the learner and deliver customized learning resources and learning activities to address the unique needs of each learner. In professional learning contexts, individuals can "test out" of some training to ensure that the instruction they engage with is new to them. Computers adapt the presentation of educational material according to students' learning needs, as indicated by their responses to questions, completed tasks, and experiences. The technology spans various fields of study and their derivatives, including computer science, artificial intelligence, psychometrics, education, psychology, and brain science.
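
To make the idea concrete, below is a minimal, hypothetical sketch of one adaptive rule: the learner's recent correctness drives the difficulty of the next item. The thresholds, difficulty levels, and function name are illustrative assumptions, not taken from any particular system.

```python
# Hypothetical adaptive item selection: recent accuracy picks the next difficulty.

def next_difficulty(responses, levels=("easy", "medium", "hard")):
    """Pick the next item difficulty from recent responses (1 = correct, 0 = wrong)."""
    if not responses:
        return levels[1]                     # no history yet: start in the middle
    accuracy = sum(responses) / len(responses)
    if accuracy > 0.8:
        return levels[2]                     # doing well -> harder material
    if accuracy < 0.5:
        return levels[0]                     # struggling -> easier material
    return levels[1]

print(next_difficulty([1, 1, 1, 1, 0, 1]))   # 5/6 correct -> 'hard'
```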

Title: Adaptive Graph Encoder for Attributed Graph Embedding

Abstract: Attributed graph embedding, which learns vector representations from graph topology and node features, is a challenging task for graph analysis. Recently, methods based on graph convolutional networks (GCNs) have made great progress on this task. However, existing GCN-based methods have three major drawbacks. First, our experiments indicate that the entanglement of graph convolutional filters and weight matrices harms both performance and robustness. Second, we show that the graph convolutional filters in these methods are special cases of generalized Laplacian smoothing filters, but they do not preserve optimal low-pass characteristics. Finally, the training objectives of existing algorithms are usually to recover the adjacency matrix or feature matrix, which is not always consistent with real-world applications. To address these issues, we propose Adaptive Graph Encoder (AGE), a novel attributed graph embedding framework. AGE consists of two modules: (1) to better alleviate the high-frequency noise in the node features, AGE first applies a carefully-designed Laplacian smoothing filter; (2) AGE employs an adaptive encoder that iteratively strengthens the filtered features for better node embeddings. We conduct experiments on four public benchmark datasets to validate AGE on node clustering and link prediction tasks. Experimental results show that AGE consistently outperforms state-of-the-art graph embedding methods on these tasks.
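
As a rough illustration of the first module, the sketch below applies a generic Laplacian smoothing (low-pass) filter to a node feature matrix. This is not the authors' implementation; the filter strength k, the number of filter applications t, and the toy graph are assumptions made for the example.

```python
import numpy as np

def laplacian_smooth(adj, features, k=0.5, t=2):
    """Apply the low-pass filter (I - k * L_sym)^t to the node feature matrix."""
    n = adj.shape[0]
    adj_loop = adj + np.eye(n)                           # add self-loops
    deg_inv_sqrt = np.diag(1.0 / np.sqrt(adj_loop.sum(axis=1)))
    lap_sym = np.eye(n) - deg_inv_sqrt @ adj_loop @ deg_inv_sqrt
    h = np.eye(n) - k * lap_sym                          # smoothing filter
    smoothed = features
    for _ in range(t):
        smoothed = h @ smoothed                          # repeated smoothing
    return smoothed

# Toy example: 3 nodes on a path graph, 2-dimensional features.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(laplacian_smooth(A, X))
```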

Latest Content

Symmetric nonnegative matrix factorization (SNMF) has been demonstrated to be a powerful method for data clustering. However, SNMF is mathematically formulated as a non-convex optimization problem, making it sensitive to the initialization of variables. Inspired by ensemble clustering, which aims to seek a better clustering result from a set of clustering results, we propose self-supervised SNMF (S$^3$NMF), which is capable of boosting clustering performance progressively by taking advantage of SNMF's sensitivity to initialization, without relying on any additional information. Specifically, we first perform SNMF repeatedly, each time with a random nonnegative matrix as the initialization, leading to multiple decomposed matrices. Then, we rank the quality of the resulting matrices with adaptively learned weights, from which a new similarity matrix that is expected to be more discriminative is reconstructed and fed to SNMF again. These two steps are iterated until the stopping criterion or the maximum number of iterations is reached. We mathematically formulate S$^3$NMF as a constrained optimization problem and provide an alternating optimization algorithm to solve it with a theoretical convergence guarantee. Extensive experimental results on $10$ commonly used benchmark datasets demonstrate the significant advantage of our S$^3$NMF over $12$ state-of-the-art methods in terms of $5$ quantitative metrics. The source code is publicly available at https://github.com/jyh-learning/SSSNMF.
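
The loop described above can be sketched roughly as follows. This is not the released implementation (see the GitHub link for that); the multiplicative-update SNMF solver and the softmax-style weighting of restarts below are simplified stand-ins for the paper's adaptively learned weights.

```python
import numpy as np

def snmf(S, k, iters=200, beta=0.5, seed=None):
    """Plain SNMF via multiplicative updates: S ~= H @ H.T with H >= 0."""
    rng = np.random.default_rng(seed)
    H = rng.random((S.shape[0], k))                    # random nonnegative init
    for _ in range(iters):
        numer = S @ H
        denom = H @ (H.T @ H) + 1e-10
        H = H * ((1.0 - beta) + beta * numer / denom)  # Ding-style update rule
    return H

def s3nmf_sketch(S, k, restarts=5, rounds=3):
    """Run SNMF from several random inits, weight the runs by reconstruction
    quality, rebuild the similarity matrix, and repeat."""
    for _ in range(rounds):
        Hs = [snmf(S, k, seed=r) for r in range(restarts)]
        errors = np.array([np.linalg.norm(S - H @ H.T) for H in Hs])
        w = np.exp(errors.min() - errors)              # smaller error -> larger weight
        weights = w / w.sum()
        # Reconstruct a (hopefully) more discriminative similarity matrix.
        S = sum(wi * (H @ H.T) for wi, H in zip(weights, Hs))
    return S, Hs[int(errors.argmin())]
```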
