ICML is the abbreviation of the International Conference on Machine Learning. It has grown into the premier annual international machine learning conference, organized by the International Machine Learning Society (IMLS).

### Featured Content

1. Federated Learning: Distributed Non-Convex Optimization with Sublinear Speedup under Intermittent Client Availability

2. Self-Supervised Learning: Improving Molecular Design by Stochastic Iterative Target Augmentation

UC Berkeley, MIT, University of Cambridge

https://arxiv.org/abs/2002.04720

3. Graph Neural Networks: Graph Prolongation Convolutional Networks: Explicitly Multiscale Machine Learning on Graphs, with Applications to Modeling of Biological Systems

University of California, Irvine, USA

https://arxiv.org/abs/2002.05842

4. Reinforcement Learning: Reinforcement Learning Enhanced Quantum-inspired Algorithm for Combinatorial Optimization

Russian Quantum Center, Moscow; University of Oxford

https://www.zhuanzhi.ai/paper/fb7eef4291247609ac9407e3211cbe1d

5. Language Models: Aligning the Pretraining and Finetuning Objectives of Language Models

https://arxiv.org/abs/2002.02000

6. Lifelong Learning: Lipschitz Lifelong Reinforcement Learning

ISAE-SUPAERO, Université de Toulouse, France

https://arxiv.org/abs/2001.05411

7. Graph Learning: Deep Coordination Graphs

University of Oxford

https://arxiv.org/abs/1910.00091

8. Data Privacy: Certified Data Removal from Machine Learning Models

Cornell University, New York, USA; Facebook AI Research

https://arxiv.org/abs/1911.03030

### Latest Papers

Due to the hierarchical structure of many machine learning problems, bilevel programming has recently become increasingly important; however, the complicated coupling between the inner and outer problems makes it extremely challenging to solve. Although several intuitive algorithms based on automatic differentiation have been proposed and have succeeded in some applications, little attention has been paid to finding the optimal formulation of the bilevel model, and whether a better formulation exists remains an open problem. In this paper, we propose an improved bilevel model that converges faster and to better solutions than the current formulation. We provide theoretical guarantees and evaluation results on two tasks: data hyper-cleaning and hyper-representation learning. The empirical results show that our model outperforms the current bilevel model by a large margin. *This is concurrent work with Liu et al. (2020), submitted to ICML 2020; we post it on arXiv for the record.*
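To make the setting concrete, here is a minimal, hypothetical sketch of the automatic-differentiation approach the abstract refers to: the inner problem is unrolled for a fixed number of gradient steps, and the hypergradient of the outer objective with respect to the outer variable is obtained by differentiating through those unrolled updates. The scalar quadratic objectives, step sizes, and variable names below are illustrative assumptions, not taken from the paper.

```python
# Toy bilevel optimization via unrolled inner gradient steps (illustrative
# example only, not the paper's model). Outer variable: lam. Inner problem:
# min_w 0.5 * (w - lam)^2. Outer objective: 0.5 * (w* - target)^2.

def inner_unrolled(lam, K=20, alpha=0.5, w0=0.0):
    """Run K inner gradient steps on f(w) = 0.5*(w - lam)^2 while
    propagating dw/dlam through each unrolled update by hand."""
    w, dw_dlam = w0, 0.0
    for _ in range(K):
        # Gradient step: w <- w - alpha * df/dw, with df/dw = (w - lam).
        w = w - alpha * (w - lam)
        # Differentiate the update rule itself with respect to lam.
        dw_dlam = (1 - alpha) * dw_dlam + alpha
    return w, dw_dlam

def hypergradient(lam, target):
    """Chain rule through the unrolled inner solve:
    dF/dlam = (w* - target) * dw*/dlam."""
    w, dw_dlam = inner_unrolled(lam)
    return (w - target) * dw_dlam

# Outer gradient descent on lam: it converges toward the target, since
# the inner solve pulls w toward lam and the outer loss pulls w to target.
lam, target = 5.0, 1.0
for _ in range(100):
    lam -= 0.5 * hypergradient(lam, target)
print(round(lam, 3))  # ~1.0
```

In practice the same unrolling is done by an autodiff framework (e.g. retaining the inner computation graph) rather than by hand, which is exactly where the inner-outer coupling the abstract mentions becomes expensive: the memory and compute cost grow with the number of unrolled inner steps.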
