Graph neural networks (GNNs) are connectionist models that capture the dependencies in a graph via message passing between its nodes. Unlike standard neural networks, a GNN maintains a state that can represent information from a node's neighborhood at arbitrary depth. In recent years, GNNs have seen increasingly wide use in social networks, knowledge graphs, recommender systems, question answering, and even the life sciences.
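The neighborhood message passing described above can be sketched in a few lines. This is a minimal GCN-style layer in NumPy, not any specific library's API; the mean aggregation and ReLU are illustrative choices.

```python
import numpy as np

def message_passing_layer(A, H, W):
    """One round of neighborhood aggregation (a minimal GCN-style sketch).

    A: (n, n) adjacency matrix, H: (n, d) node features,
    W: (d, d_out) weight matrix (learnable in a real model).
    """
    A_hat = A + np.eye(A.shape[0])          # self-loops: a node keeps its own state
    deg = A_hat.sum(axis=1, keepdims=True)  # node degrees for mean aggregation
    messages = A_hat @ H / deg              # each node averages its neighbors' features
    return np.maximum(messages @ W, 0.0)    # linear transform + ReLU

# Two stacked layers let information travel two hops through the graph.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.eye(3)                               # one-hot initial node features
rng = np.random.default_rng(0)
H1 = message_passing_layer(A, H, rng.standard_normal((3, 4)))
H2 = message_passing_layer(A, H1, rng.standard_normal((4, 4)))
print(H2.shape)  # (3, 4)
```

Stacking k such layers gives each node a view of its k-hop neighborhood, which is the "arbitrary depth" of neighborhood information mentioned above.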

Knowledge Collection

Graph Neural Networks (GNN): A Zhuanzhi Curated Collection

Getting Started

Surveys

  • A Comprehensive Survey on Graph Neural Networks. Zonghan Wu, Shirui Pan, Fengwen Chen, Guodong Long, Chengqi Zhang, Philip S. Yu. 2019
    https://arxiv.org/pdf/1901.00596.pdf
  • Relational inductive biases, deep learning, and graph networks. Peter W. Battaglia, Jessica B. Hamrick, Victor Bapst, Alvaro Sanchez-Gonzalez, Vinicius Zambaldi, Mateusz Malinowski, Andrea Tacchetti, David Raposo, Adam Santoro, Ryan Faulkner, Caglar Gulcehre, Francis Song, Andrew Ballard, Justin Gilmer, George Dahl, Ashish Vaswani, Kelsey Allen, Charles Nash, Victoria Langston, Chris Dyer, Nicolas Heess, Daan Wierstra, Pushmeet Kohli, Matt Botvinick, Oriol Vinyals, Yujia Li, Razvan Pascanu. 2018.
    https://arxiv.org/pdf/1806.01261.pdf
  • Attention models in graphs. John Boaz Lee, Ryan A. Rossi, Sungchul Kim, Nesreen K. Ahmed, Eunyee Koh. 2018.
    https://arxiv.org/pdf/1807.07984.pdf
  • Deep learning on graphs: A survey. Ziwei Zhang, Peng Cui and Wenwu Zhu. 2018.
    https://arxiv.org/pdf/1812.04202.pdf
  • Graph Neural Networks: A Review of Methods and Applications. Jie Zhou, Ganqu Cui, Zhengyan Zhang, Cheng Yang, Zhiyuan Liu, Maosong Sun. 2018
    https://arxiv.org/pdf/1812.08434.pdf
  • Geometric deep learning: going beyond euclidean data. Michael M. Bronstein, Joan Bruna, Yann LeCun, Arthur Szlam, Pierre Vandergheynst. 2016.
    https://arxiv.org/pdf/1611.08097.pdf

Advanced Papers

Recurrent Graph Neural Networks

Convolutional Graph Neural Networks

Spectral and Spatial

Architecture

Attention Mechanisms

Convolution

Training Methods

Pooling

Bayesian

Analysis

GAE

Spatial-Temporal Graph Neural Networks

Applications

Physics

Knowledge Graph

Recommender Systems

  • STAR-GCN: Stacked and Reconstructed Graph Convolutional Networks for Recommender Systems. Jiani Zhang, Xingjian Shi, Shenglin Zhao, Irwin King. IJCAI 2019.
    https://arxiv.org/pdf/1905.13129.pdf

  • Binarized Collaborative Filtering with Distilling Graph Convolutional Networks. Haoyu Wang, Defu Lian, Yong Ge. IJCAI 2019.
    https://arxiv.org/pdf/1906.01829.pdf

  • Graph Contextualized Self-Attention Network for Session-based Recommendation. Chengfeng Xu, Pengpeng Zhao, Yanchi Liu, Victor S. Sheng, Jiajie Xu, Fuzhen Zhuang, Junhua Fang, Xiaofang Zhou. IJCAI 2019.
    https://www.ijcai.org/proceedings/2019/0547.pdf

  • Session-based Recommendation with Graph Neural Networks. Shu Wu, Yuyuan Tang, Yanqiao Zhu, Liang Wang, Xing Xie, Tieniu Tan. AAAI 2019.
    https://arxiv.org/pdf/1811.00855.pdf

  • Geometric Hawkes Processes with Graph Convolutional Recurrent Neural Networks. Jin Shang, Mingxuan Sun. AAAI 2019.
    https://jshang2.github.io/pubs/geo.pdf

  • Knowledge-aware Graph Neural Networks with Label Smoothness Regularization for Recommender Systems. Hongwei Wang, Fuzheng Zhang, Mengdi Zhang, Jure Leskovec, Miao Zhao, Wenjie Li, Zhongyuan Wang. KDD 2019.
    https://arxiv.org/pdf/1905.04413

  • Exact-K Recommendation via Maximal Clique Optimization. Yu Gong, Yu Zhu, Lu Duan, Qingwen Liu, Ziyu Guan, Fei Sun, Wenwu Ou, Kenny Q. Zhu. KDD 2019.
    https://arxiv.org/pdf/1905.07089

  • KGAT: Knowledge Graph Attention Network for Recommendation. Xiang Wang, Xiangnan He, Yixin Cao, Meng Liu, Tat-Seng Chua. KDD 2019.
    https://arxiv.org/pdf/1905.07854

  • Knowledge Graph Convolutional Networks for Recommender Systems. Hongwei Wang, Miao Zhao, Xing Xie, Wenjie Li, Minyi Guo. WWW 2019.
    https://arxiv.org/pdf/1904.12575.pdf

  • Dual Graph Attention Networks for Deep Latent Representation of Multifaceted Social Effects in Recommender Systems. Qitian Wu, Hengrui Zhang, Xiaofeng Gao, Peng He, Paul Weng, Han Gao, Guihai Chen. WWW 2019.
    https://arxiv.org/pdf/1903.10433.pdf

  • Graph Neural Networks for Social Recommendation. Wenqi Fan, Yao Ma, Qing Li, Yuan He, Eric Zhao, Jiliang Tang, Dawei Yin. WWW 2019.
    https://arxiv.org/pdf/1902.07243.pdf

  • Graph Convolutional Neural Networks for Web-Scale Recommender Systems. Rex Ying, Ruining He, Kaifeng Chen, Pong Eksombatchai, William L. Hamilton, Jure Leskovec. KDD 2018.
    https://arxiv.org/abs/1806.01973

  • Geometric Matrix Completion with Recurrent Multi-Graph Neural Networks. Federico Monti, Michael M. Bronstein, Xavier Bresson. NIPS 2017.
    https://arxiv.org/abs/1704.06803

  • Graph Convolutional Matrix Completion. Rianne van den Berg, Thomas N. Kipf, Max Welling. 2017.
    https://arxiv.org/abs/1706.02263

Computer Vision

Natural Language Processing

Others

Tutorials

Video Tutorials

Code

Domain Experts

VIP Content

Link prediction is a fundamental task on graphs. Inspired by traditional path-based methods, this paper proposes a general and flexible representation-learning framework for link prediction based on paths. Specifically, the representation of a node pair is defined as the generalized sum of all path representations between them, with each path representation defined as the generalized product of the edge representations along the path. Motivated by the Bellman-Ford algorithm for the shortest-path problem, the authors show that this path formulation can be solved efficiently by a generalized Bellman-Ford algorithm. To further increase the capacity of the path representations, they propose the Neural Bellman-Ford Network (NBFNet), a general graph neural network framework that solves the path formulation with learned operators in the generalized Bellman-Ford algorithm. NBFNet parameterizes the algorithm with three neural components, corresponding to the boundary condition, the multiplication operator, and the summation operator. NBFNet is very general: it covers many traditional path-based methods and applies to both homogeneous graphs and multi-relational graphs (e.g., knowledge graphs) in transductive and inductive settings. Experiments on homogeneous graphs and knowledge graphs show that NBFNet substantially outperforms existing methods in both settings, achieving new state-of-the-art results.

https://www.zhuanzhi.ai/paper/15b186a8fcbae87c07eef96f6692c300
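The generalized Bellman-Ford iteration described in the abstract can be sketched concretely. With `combine = min` and `multiply = +` it recovers shortest-path distances; NBFNet's contribution is to replace the boundary condition and these two operators with learned neural components, which this plain-Python sketch does not attempt.

```python
import math

def generalized_bellman_ford(n, edges, source, num_iters,
                             combine=min, multiply=lambda a, b: a + b,
                             identity=math.inf, boundary=0.0):
    """Generalized Bellman-Ford iteration (a sketch; NBFNet learns the operators).

    h[v] accumulates, over all paths from source to v, the generalized sum of
    the generalized products of edge values along each path.
    """
    h = [identity] * n
    h[source] = boundary                     # boundary condition: indicator of the source
    for _ in range(num_iters):
        new_h = list(h)
        for u, v, w in edges:                # extend each known path by one edge
            new_h[v] = combine(new_h[v], multiply(h[u], w))
        h = new_h
    return h

edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 5.0), (2, 3, 1.0)]
print(generalized_bellman_ford(4, edges, source=0, num_iters=4))
# → [0.0, 1.0, 3.0, 4.0]
```

Each iteration propagates path information one more hop, so after `num_iters` rounds the result covers all paths of that length, mirroring how NBFNet stacks message-passing layers.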


Latest Papers

Few-shot learning presents a challenging paradigm for training discriminative models on a few training samples representing the target classes to discriminate. However, classification methods based on deep learning are ill-suited for such learning as they need large amounts of training data, let alone one-shot learning. Recently, graph neural networks (GNNs) have been introduced to the field of network neuroscience, where the brain connectivity is encoded in a graph. However, with scarce neuroimaging datasets, particularly for rare diseases and low-resource clinical facilities, such data-devouring architectures might fail in learning the target task. In this paper, we take a very different approach in training GNNs, where we aim to learn with one sample and achieve the best performance, a formidable challenge to tackle. Specifically, we present the first one-shot paradigm where a GNN is trained on a single population-driven template, namely a connectional brain template (CBT). A CBT is a compact representation of a population of brain graphs capturing the unique connectivity patterns shared across individuals. It is analogous to brain image atlases for neuroimaging datasets. Using a one-representative CBT as a training sample, we alleviate the training load of GNN models while boosting their performance across a variety of classification and regression tasks. We demonstrate that our method significantly outperformed benchmark one-shot learning methods with downstream classification and time-dependent brain graph data forecasting tasks, while competing with the train-on-all conventional training strategy. Our source code can be found at https://github.com/basiralab/one-representative-shot-learning.
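The one-shot paradigm above hinges on collapsing a population of brain graphs into a single representative template. As a rough illustration only: here an element-wise median stands in for the learned CBT estimators used in the actual work, and the matrix sizes are made up.

```python
import numpy as np

# Hypothetical population: 20 subjects, each a symmetric 35x35 connectivity matrix.
rng = np.random.default_rng(0)
n_subjects, n_rois = 20, 35
population = rng.random((n_subjects, n_rois, n_rois))
population = (population + population.transpose(0, 2, 1)) / 2  # symmetrize

# Collapse the population into one representative template (CBT stand-in).
cbt = np.median(population, axis=0)

# The single CBT, not the 20 subject graphs, would then be fed to the GNN trainer.
print(cbt.shape)  # (35, 35)
```

The point of the paradigm is that downstream GNN training sees only this one sample, which is why the template must capture the connectivity patterns shared across the population.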
