Machine learning on graphs is an important and ubiquitous task, with applications ranging from drug design to friend recommendation in social networks. The central challenge in this field is finding a way to represent, or encode, graph structure so that machine learning models can exploit it easily.
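
To make the encoding problem concrete, here is a minimal sketch (the toy graph and the choice of degree as a node feature are illustrative assumptions) of turning a small graph into the adjacency matrix and node-feature matrix that most graph ML models consume, using networkx and numpy.

```python
import networkx as nx
import numpy as np

# A tiny toy graph: 4 nodes and a few edges (illustrative example data).
g = nx.Graph()
g.add_edges_from([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)])

# Structural encoding: the adjacency matrix A.
A = nx.to_numpy_array(g)                           # shape (4, 4)

# Attribute encoding: one feature vector per node; here simply the node degree.
X = np.array([[g.degree(n)] for n in g.nodes()])   # shape (4, 1)

print(A)
print(X)
```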

Knowledge Collection

Machine Learning on Graphs: A Curated Collection

Surveys

Deep Learning on Graphs: A Survey

Graphs in machine learning: an introduction

Representation Learning on Graphs: Methods and Applications

Attention Models in Graphs: A Survey

A Survey on Network Embedding

Graph Embedding Techniques, Applications, and Performance: A Survey

A Comprehensive Survey of Graph Embedding: Problems, Techniques and Applications

Graph Neural Networks: A Review of Methods and Applications

Models and Algorithms

Getting Started

Machine Learning on Graphs @ NeurIPS 2019

Scalable graph machine learning: a mountain we can climb?

How to get started with machine learning on graphs?

Knowing Your Neighbours: Machine Learning on Graphs

Machine learning with graphs: the next big thing?

AI & Graph Technology: How Graphs Accelerate Machine Learning

Video Courses

Stanford course: CS224W: Machine Learning with Graphs (Stanford / Fall 2019), by Jure Leskovec

Octavian Machine Learning on Graphs Course Cohort 1

Machine learning on graphs

Code

StellarGraph Machine Learning Library

Tutorial

Machine Learning over graphs

Practical Machine Learning on Graphs course

VIP Content

[Overview] This article introduces the latest Stanford course, CS224W: Machine Learning with Graphs, taught by Jure Leskovec, an associate professor in Stanford University's Computer Science Department and one of the authors of the graph representation learning methods node2vec and GraphSAGE.

In recent years, graph neural networks (GNNs) have become a hot research topic in network representation learning and analysis. Their defining feature is the application of deep learning techniques, most notably neural networks, to modeling and computing over network structure. GNNs can take into account a network's nodes and edges together with their associated labels, attributes, and text, enabling finer-grained modeling and deeper reasoning over the network; they have been widely applied in natural language processing, social network analysis, recommender systems, and other areas. This course is arguably the first in recent years to cover graph machine learning comprehensively; its syllabus is both up to date and broad, including the limitations and applications of the currently popular graph neural networks. All of the course slides are available on the course web page, and readers working in this area are encouraged to study them.
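
To make the idea of "modeling over network structure" concrete, the sketch below implements one GCN-style propagation step, H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W), the kind of neighborhood aggregation a GNN layer performs. The sizes and inputs are illustrative assumptions, not material from the course.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN-style propagation step: aggregate neighbor features, then transform."""
    n = A.shape[0]
    A_hat = A + np.eye(n)                            # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt         # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)           # ReLU non-linearity

# Toy inputs: 4 nodes, 3-dimensional features, 2 output channels (illustrative only).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
H = np.random.rand(4, 3)
W = np.random.rand(3, 2)
print(gcn_layer(A, H, W).shape)                      # (4, 2)
```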

Original link: http://web.stanford.edu/class/cs224w/

1 Course Introduction

Complex data can be represented as a graph of relationships between objects, and such networks are a fundamental tool for modeling social, technological, and biological systems. Coupled with the emergence of online social networks and the availability of large-scale data in the biological sciences, this course focuses on the analysis of massive networks, which pose distinctive computational, algorithmic, and modeling challenges. By studying the underlying graph structure and its features, students are introduced to machine learning techniques and data mining tools that help reveal insights into a variety of social, technological, and natural networks.

Topics include: representation learning and graph neural networks; algorithms for the World Wide Web; reasoning over knowledge graphs; influence maximization; disease outbreak detection; and social network analysis.
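
As a small taste of the "algorithms for the World Wide Web" topic, here is a minimal sketch of PageRank via power iteration on a toy link graph. The damping factor and the example adjacency matrix are illustrative choices, not course material.

```python
import numpy as np

def pagerank(A, d=0.85, tol=1e-8, max_iter=100):
    """Power iteration for PageRank; A[i, j] = 1 means page i links to page j."""
    n = A.shape[0]
    out_deg = A.sum(axis=1, keepdims=True)
    out_deg[out_deg == 0] = 1.0          # avoid division by zero for dangling pages
    M = A / out_deg                      # row-stochastic transition matrix
    r = np.full(n, 1.0 / n)              # start from the uniform distribution
    for _ in range(max_iter):
        r_new = (1 - d) / n + d * (M.T @ r)
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r

# Toy web graph with 4 pages (illustrative only).
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(pagerank(A))
```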

2 Instructor

Jure Leskovec

The instructor, Jure Leskovec, is a leading researcher in graph learning. He is an associate professor in Stanford University's Computer Science Department and one of the authors of the graph representation learning methods node2vec and GraphSAGE. On Google Scholar, he has close to 45,000 citations and an h-index of 84.

Leskovec's research focuses on analyzing and modeling large social and information networks in order to study phenomena across the social, technological, and natural worlds. He concentrates on statistical modeling of network structure, network evolution, and the spread of information, influence, and contagion over networks. The problems he studies are motivated by large-scale data, the Web, and other online media. He also works on applications of text mining and machine learning.

Homepage: https://cs.stanford.edu/~jure/

3 Course Syllabus

  • 01: Introduction; Machine Learning for Graphs
  • 02: Traditional Methods for ML on Graphs
  • 03: Link Analysis: PageRank
  • 04: Node Embeddings
  • 05: Label Propagation for Node Classification (see the sketch after this list)
  • 06: Graph Neural Networks 1: GNN Model
  • 07: Graph Neural Networks 2: Design Space
  • 08: Applications of Graph Neural Networks
  • 09: Theory of Graph Neural Networks
  • 10: Knowledge Graph Embeddings
  • 11: Reasoning over Knowledge Graphs
  • 12: Frequent Subgraph Mining with GNNs
  • 13: Community Structure in Networks
  • 14: Traditional Generative Models for Graphs
  • 15: Deep Generative Models for Graphs
  • 16: Scaling Up GNNs
  • 17: Learning on Dynamic Graphs
  • 18: GNNs for Computational Biology
  • 19: GNNs for Science
  • 20: Industrial Applications of GNNs
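
To illustrate lecture 05 above (label propagation for node classification), the following is a minimal from-scratch sketch: known labels are clamped while every other node repeatedly averages its neighbors' label distributions. The toy graph, seed labels, and iteration count are illustrative assumptions, not taken from the course slides.

```python
import numpy as np

def label_propagation(A, y_seed, mask, num_iter=20):
    """Propagate class probabilities over the graph given by adjacency matrix A.

    y_seed: (n, c) one-hot labels; rows for unlabeled nodes are all zeros.
    mask:   (n,) boolean array, True where the label is known and kept fixed.
    """
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0
    P = A / deg                          # row-normalized transition matrix
    F = y_seed.astype(float).copy()
    for _ in range(num_iter):
        F = P @ F                        # average the neighbors' label distributions
        F[mask] = y_seed[mask]           # clamp known labels to their true values
    return F.argmax(axis=1)

# Toy graph: two triangles joined by one edge, one labeled node per triangle.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
y_seed = np.zeros((6, 2))
y_seed[0, 0] = 1.0                       # node 0 is labeled class 0
y_seed[5, 1] = 1.0                       # node 5 is labeled class 1
mask = np.array([True, False, False, False, False, True])
print(label_propagation(A, y_seed, mask))   # expected: [0 0 0 1 1 1]
```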

4 Course Materials

Graph Representation Learning, by William L. Hamilton

Networks, Crowds, and Markets: Reasoning About a Highly Connected World, by David Easley and Jon Kleinberg

Network Science, by Albert-László Barabási


Latest Papers

Enabling effective and efficient machine learning (ML) over large-scale graph data (e.g., graphs with billions of edges) can have a huge impact on both industrial and scientific applications. However, community efforts to advance large-scale graph ML have been severely limited by the lack of a suitable public benchmark. For KDD Cup 2021, we present OGB Large-Scale Challenge (OGB-LSC), a collection of three real-world datasets for advancing the state-of-the-art in large-scale graph ML. OGB-LSC provides graph datasets that are orders of magnitude larger than existing ones and covers three core graph learning tasks -- link prediction, graph regression, and node classification. Furthermore, OGB-LSC provides dedicated baseline experiments, scaling up expressive graph ML models to the massive datasets. We show that the expressive models significantly outperform simple scalable baselines, indicating an opportunity for dedicated efforts to further improve graph ML at scale. Our datasets and baseline code are released and maintained as part of our OGB initiative (Hu et al., 2020). We hope OGB-LSC at KDD Cup 2021 can empower the community to discover innovative solutions for large-scale graph ML.
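
For readers who want to try these benchmarks, below is a hedged sketch of loading one OGB-LSC dataset (MAG240M, the node classification task) with the ogb Python package. The class and attribute names follow the OGB documentation, but loader details vary across ogb versions, so treat them as assumptions and check the official docs before running (the first download is several hundred gigabytes).

```python
# A hedged sketch of loading an OGB-LSC dataset; verify names against your ogb version.
from ogb.lsc import MAG240MDataset

dataset = MAG240MDataset(root="dataset/")       # triggers a very large download on first use

print(dataset.num_papers)                       # number of paper nodes in the graph
split = dataset.get_idx_split()                 # dict of train/valid/test paper indices
train_idx = split["train"]

# Node features and labels are exposed as (memory-mapped) numpy arrays.
feat = dataset.paper_feat[train_idx[0]]         # 768-dim text feature of one paper
label = dataset.paper_label[train_idx[0]]       # its arXiv subject-area label
print(feat.shape, label)
```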
