【Knowledge Distillation】A Curated List of Knowledge Distillation Papers
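Most of the papers below build on, or compare against, the classic soft-target distillation objective of Hinton et al. (2015): a student is trained to match the teacher's temperature-softened output distribution in addition to the ground-truth labels. As a point of reference, here is a minimal NumPy sketch of that objective; the temperature `T` and mixing weight `alpha` are illustrative defaults, not values taken from any paper in this list.

```python
import numpy as np

def softmax(z, T=1.0):
    """Numerically stable softmax with temperature T."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target knowledge distillation loss (Hinton et al., 2015)."""
    # Soft-target term: KL(teacher_T || student_T), scaled by T^2 so its
    # gradient magnitude stays comparable to the hard-label term.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    distill = np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1)) * T * T
    # Hard-label cross-entropy on the student's ordinary (T=1) distribution.
    q1 = softmax(student_logits)
    hard = -np.mean(np.log(q1[np.arange(len(labels)), labels]))
    return alpha * distill + (1 - alpha) * hard
```

Many entries below replace or augment this output-level term, e.g. with intermediate-layer, relational, or adversarial objectives, but the teacher-student training setup stays the same.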
1. Towards Cross-Modality Medical Image Segmentation with Online Mutual Knowledge Distillation
Venue: AAAI 2020. AAAI Technical Track: Applications.
Authors: Kang Li, Lequan Yu, Shujun Wang, Pheng-Ann Heng
Link: https://aaai.org/ojs/index.php/AAAI/article/view/5421/5277
2. Online Knowledge Distillation with Diverse Peers
Venue: AAAI 2020. AAAI Technical Track: Machine Learning.
Authors: Defang Chen, Jian-Ping Mei, Can Wang, Yan Feng, Chun Chen
Link: https://aaai.org/ojs/index.php/AAAI/article/view/5746/5602
3. Towards Oracle Knowledge Distillation with Neural Architecture Search
Venue: AAAI 2020. AAAI Technical Track: Machine Learning.
Authors: Minsoo Kang, Jonghwan Mun, Bohyung Han
Link: https://aaai.org/ojs/index.php/AAAI/article/view/5866/5722
4. Improved Knowledge Distillation via Teacher Assistant
Venue: AAAI 2020. AAAI Technical Track: Machine Learning.
Authors: Seyed-Iman Mirzadeh, Mehrdad Farajtabar, Ang Li, Nir Levine, Akihiro Matsukawa, Hassan Ghasemzadeh
Link: https://aaai.org/ojs/index.php/AAAI/article/view/5963/5819
5. Knowledge Distillation from Internal Representations
Venue: AAAI 2020. AAAI Technical Track: Natural Language Processing.
Authors: Gustavo Aguilar, Yuan Ling, Yu Zhang, Benjamin Yao, Xing Fan, Chenlei Guo
Link: https://aaai.org/ojs/index.php/AAAI/article/view/6229/6085
6. Ultrafast Video Attention Prediction with Coupled Knowledge Distillation
Venue: AAAI 2020. AAAI Technical Track: Vision.
Authors: Kui Fu, Peipei Shi, Yafei Song, Shiming Ge, Xiangju Lu, Jia Li
Link: https://aaai.org/ojs/index.php/AAAI/article/view/6710/6564
7. Uncertainty-Aware Multi-Shot Knowledge Distillation for Image-Based Object Re-Identification
Venue: AAAI 2020. AAAI Technical Track: Vision.
Authors: Xin Jin, Cuiling Lan, Wenjun Zeng, Zhibo Chen
Link: https://aaai.org/ojs/index.php/AAAI/article/view/6774/6628
8. Structure-Level Knowledge Distillation For Multilingual Sequence Labeling
Venue: ACL 2020.
Authors: Xinyu Wang, Yong Jiang, Nguyen Bach, Tao Wang, Fei Huang, Kewei Tu
Link: https://www.aclweb.org/anthology/2020.acl-main.304.pdf
9. Knowledge Distillation for Multilingual Unsupervised Neural Machine Translation
Venue: ACL 2020.
Authors: Haipeng Sun, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao
Link: https://www.aclweb.org/anthology/2020.acl-main.324.pdf
10. TextBrewer: An Open-Source Knowledge Distillation Toolkit for Natural Language Processing
Venue: ACL 2020. System Demonstrations.
Authors: Ziqing Yang, Yiming Cui, Zhipeng Chen, Wanxiang Che, Ting Liu, Shijin Wang, Guoping Hu
Link: https://www.aclweb.org/anthology/2020.acl-demos.2.pdf
11. End-to-End Speech-Translation with Knowledge Distillation: FBK@IWSLT2020
Venue: ACL 2020. The 17th International Conference on Spoken Language Translation.
Authors: Marco Gaido, Mattia A. Di Gangi, Matteo Negri, Marco Turchi
Link: https://www.aclweb.org/anthology/2020.iwslt-1.8.pdf
12. Exploring the Limits of Simple Learners in Knowledge Distillation for Document Classification with DocBERT
Venue: ACL 2020. The 5th Workshop on Representation Learning for NLP.
Authors: Ashutosh Adhikari, Achyudh Ram, Raphael Tang, William L. Hamilton, Jimmy Lin
Link: https://www.aclweb.org/anthology/2020.repl4nlp-1.10.pdf
13. Neural Networks Are More Productive Teachers Than Human Raters: Active Mixup for Data-Efficient Knowledge Distillation From a Blackbox Model
Venue: CVPR 2020.
Authors: Dongdong Wang, Yandong Li, Liqiang Wang, Boqing Gong
Link: https://openaccess.thecvf.com/content_CVPR_2020/papers/Wang_Neural_Networks_Are_More_Productive_Teachers_Than_Human_Raters_Active_CVPR_2020_paper.pdf
14. Creating Something From Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing
Venue: CVPR 2020.
Authors: Hengtong Hu, Lingxi Xie, Richang Hong, Qi Tian
Link: https://openaccess.thecvf.com/content_CVPR_2020/papers/Hu_Creating_Something_From_Nothing_Unsupervised_Knowledge_Distillation_for_Cross-Modal_Hashing_CVPR_2020_paper.pdf
15. Heterogeneous Knowledge Distillation Using Information Flow Modeling
Venue: CVPR 2020.
Authors: Nikolaos Passalis, Maria Tzelepi, Anastasios Tefas
Link: https://openaccess.thecvf.com/content_CVPR_2020/papers/Passalis_Heterogeneous_Knowledge_Distillation_Using_Information_Flow_Modeling_CVPR_2020_paper.pdf
16. Revisiting Knowledge Distillation via Label Smoothing Regularization
Venue: CVPR 2020.
Authors: Li Yuan, Francis EH Tay, Guilin Li, Tao Wang, Jiashi Feng
Link: https://openaccess.thecvf.com/content_CVPR_2020/papers/Yuan_Revisiting_Knowledge_Distillation_via_Label_Smoothing_Regularization_CVPR_2020_paper.pdf
17. Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation
Venue: CVPR 2020.
Authors: Changlin Li, Jiefeng Peng, Liuchun Yuan, Guangrun Wang, Xiaodan Liang, Liang Lin, Xiaojun Chang
Link: https://openaccess.thecvf.com/content_CVPR_2020/papers/Li_Block-Wisely_Supervised_Neural_Architecture_Search_With_Knowledge_Distillation_CVPR_2020_paper.pdf
18. Few Sample Knowledge Distillation for Efficient Network Compression
Venue: CVPR 2020.
Authors: Tianhong Li, Jianguo Li, Zhuang Liu, Changshui Zhang
Link: https://openaccess.thecvf.com/content_CVPR_2020/papers/Li_Few_Sample_Knowledge_Distillation_for_Efficient_Network_Compression_CVPR_2020_paper.pdf
19. Online Knowledge Distillation via Collaborative Learning
Venue: CVPR 2020.
Authors: Qiushan Guo, Xinjiang Wang, Yichao Wu, Zhipeng Yu, Ding Liang, Xiaolin Hu, Ping Luo
Link: https://openaccess.thecvf.com/content_CVPR_2020/papers/Guo_Online_Knowledge_Distillation_via_Collaborative_Learning_CVPR_2020_paper.pdf
20. Circumventing Outliers of AutoAugment with Knowledge Distillation
Venue: ECCV 2020.
Authors: Longhui Wei, An Xiao, Lingxi Xie, Xiaopeng Zhang, Xin Chen, Qi Tian
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123480613.pdf
21. Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification
Venue: ECCV 2020.
Authors: Liuyu Xiang, Guiguang Ding, Jungong Han
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123500239.pdf
22. Knowledge Distillation Meets Self-Supervision
Venue: ECCV 2020.
Authors: Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123540562.pdf
23. Robust Re-Identification by Multiple Views Knowledge Distillation
Venue: ECCV 2020.
Authors: Angelo Porrello, Luca Bergamini, Simone Calderara
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123550103.pdf
24. Local Correlation Consistency for Knowledge Distillation
Venue: ECCV 2020.
Authors: Xiaojie Li, Jianlong Wu, Hongyu Fang, Yue Liao, Fei Wang, Chen Qian
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123570018.pdf
25. AMLN: Adversarial-based Mutual Learning Network for Online Knowledge Distillation
Venue: ECCV 2020.
Authors: Xiaobing Zhang, Shijian Lu, Haigang Gong, Zhipeng Luo, Ming Liu
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123570154.pdf
26. Differentiable Feature Aggregation Search for Knowledge Distillation
Venue: ECCV 2020.
Authors: Yushuo Guan, Pengyu Zhao, Bingxuan Wang, Yuanxing Zhang, Cong Yao, Kaigui Bian, Jian Tang
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123620460.pdf
27. Online Ensemble Model Compression using Knowledge Distillation
Venue: ECCV 2020.
Authors: Devesh Walawalkar, Zhiqiang Shen, Marios Savvides
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123640018.pdf
28. Exclusivity-Consistency Regularized Knowledge Distillation for Face Recognition
Venue: ECCV 2020.
Authors: Xiaobo Wang, Tianyu Fu, Shengcai Liao, Shuo Wang, Zhen Lei, Tao Mei
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123690324.pdf
29. Feature Normalized Knowledge Distillation for Image Classification
Venue: ECCV 2020.
Authors: Kunran Xu, Lai Rui, Yishi Li, Lin Gu
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123700664.pdf
30. Weight Decay Scheduling and Knowledge Distillation for Active Learning
Venue: ECCV 2020.
Authors: Juseung Yun, Byungjoo Kim, Junmo Kim
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123710426.pdf
31. Semantic Relation Preserving Knowledge Distillation for Image-to-Image Translation
Venue: ECCV 2020.
Authors: Zeqi Li, Ruowei Jiang, Parham Aarabi
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123710647.pdf
32. Interpretable Foreground Object Search As Knowledge Distillation
Venue: ECCV 2020.
Authors: Boren Li, Po-Yu Zhuang, Jian Gu, Mingyang Li, Ping Tan
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123730188.pdf
33. Improving Knowledge Distillation via Category Structure
Venue: ECCV 2020.
Authors: Zailiang Chen, Xianxian Zheng, Hailan Shen, Ziyang Zeng, Yukun Zhou, Rongchang Zhao
Link: https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123730205.pdf
34. Making Monolingual Sentence Embeddings Multilingual Using Knowledge Distillation
Venue: EMNLP 2020. Long Paper.
Authors: Nils Reimers, Iryna Gurevych
Link: https://www.aclweb.org/anthology/2020.emnlp-main.365.pdf
35. Lifelong Language Knowledge Distillation
Venue: EMNLP 2020. Long Paper.
Authors: Yung-Sung Chuang, Shang-Yu Su, Yun-Nung Chen
Link: https://www.aclweb.org/anthology/2020.emnlp-main.233.pdf
36. Autoregressive Knowledge Distillation through Imitation Learning
Venue: EMNLP 2020. Long Paper.
Authors: Alexander Lin, Jeremy Wohlwend, Howard Chen, Tao Lei
Link: https://www.aclweb.org/anthology/2020.emnlp-main.494.pdf
37. Improving Neural Topic Models Using Knowledge Distillation
Venue: EMNLP 2020. Long Paper.
Authors: Alexander Miserlis Hoyle, Pranav Goel, Philip Resnik
Link: https://www.aclweb.org/anthology/2020.emnlp-main.137.pdf
38. Why Skip If You Can Combine: A Simple Knowledge Distillation Technique for Intermediate Layers
Venue: EMNLP 2020. Short Paper.
Authors: Yimeng Wu, Peyman Passban, Mehdi Rezagholizadeh, Qun Liu
Link: https://www.aclweb.org/anthology/2020.emnlp-main.74.pdf
39. Understanding Knowledge Distillation in Non-autoregressive Machine Translation
Venue: ICLR 2020.
Authors: Chunting Zhou, Jiatao Gu, Graham Neubig
Link: https://openreview.net/pdf?id=BygFVAEKDH
40. P-KDGAN: Progressive Knowledge Distillation with GANs for One-class Novelty Detection
Venue: IJCAI 2020.
Authors: Zhiwei Zhang, Shifeng Chen, Lei Sun
Link: https://www.ijcai.org/proceedings/2020/0448.pdf
41. Private Model Compression via Knowledge Distillation
Venue: AAAI 2019. AAAI Technical Track: Applications.
Authors: Ji Wang, Weidong Bao, Lichao Sun, Xiaomin Zhu, Bokai Cao, Philip S. Yu
Link: https://aaai.org/ojs/index.php/AAAI/article/view/3913/3791
42. Knowledge Distillation with Adversarial Samples Supporting Decision Boundary
Venue: AAAI 2019. AAAI Technical Track: Machine Learning.
Authors: Byeongho Heo, Minsik Lee, Sangdoo Yun, Jin Young Choi
Link: https://www.aaai.org/ojs/index.php/AAAI/article/view/4263/4141
43. Exploiting the Ground-Truth: An Adversarial Imitation Based Knowledge Distillation Approach for Event Detection
Venue: AAAI 2019. AAAI Technical Track: Natural Language Processing.
Authors: Jian Liu, Yubo Chen, Kang Liu
Link: https://aaai.org/ojs/index.php/AAAI/article/view/4649/4527
44. Scalable Syntax-Aware Language Models Using Knowledge Distillation
Venue: ACL 2019.
Authors: Adhiguna Kuncoro, Chris Dyer, Laura Rimell, Stephen Clark, Phil Blunsom
Link: https://www.aclweb.org/anthology/P19-1337.pdf
45. PANLP at MEDIQA 2019: Pre-trained Language Models, Transfer Learning and Knowledge Distillation
Venue: ACL 2019. The 18th BioNLP Workshop and Shared Task.
Authors: Wei Zhu, Xiaofeng Zhou, Keqiang Wang, Xun Luo, Xiepeng Li, Yuan Ni, Guotong Xie
Link: https://www.aclweb.org/anthology/W19-5040.pdf
46. Structured Knowledge Distillation for Semantic Segmentation
Venue: CVPR 2019.
Authors: Yifan Liu, Ke Chen, Chris Liu, Zengchang Qin, Zhenbo Luo, Jingdong Wang
Link: https://openaccess.thecvf.com/content_CVPR_2019/papers/Liu_Structured_Knowledge_Distillation_for_Semantic_Segmentation_CVPR_2019_paper.pdf
47. Relational Knowledge Distillation
Venue: CVPR 2019.
Authors: Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho
Link: https://openaccess.thecvf.com/content_CVPR_2019/papers/Park_Relational_Knowledge_Distillation_CVPR_2019_paper.pdf
48. Knowledge Distillation via Instance Relationship Graph
Venue: CVPR 2019.
Authors: Yufan Liu, Jiajiong Cao, Bing Li, Chunfeng Yuan, Weiming Hu, Yangxi Li, Yunqiang Duan
Link: https://openaccess.thecvf.com/content_CVPR_2019/papers/Liu_Knowledge_Distillation_via_Instance_Relationship_Graph_CVPR_2019_paper.pdf
49. Refine and Distill: Exploiting Cycle-Inconsistency and Knowledge Distillation for Unsupervised Monocular Depth Estimation
Venue: CVPR 2019.
Authors: Andrea Pilzer, Stephane Lathuiliere, Nicu Sebe, Elisa Ricci
Link: https://openaccess.thecvf.com/content_CVPR_2019/papers/Pilzer_Refine_and_Distill_Exploiting_Cycle-Inconsistency_and_Knowledge_Distillation_for_Unsupervised_CVPR_2019_paper.pdf
50. Patient Knowledge Distillation for BERT Model Compression
Venue: EMNLP 2019.
Authors: Siqi Sun, Yu Cheng, Zhe Gan, Jingjing Liu
Link: https://www.aclweb.org/anthology/D19-1441.pdf
51. Weakly Supervised Cross-lingual Semantic Relation Classification via Knowledge Distillation
Venue: EMNLP 2019.
Authors: Yogarshi Vyas, Marine Carpuat
Link: https://www.aclweb.org/anthology/D19-1532.pdf
52. Natural Language Generation for Effective Knowledge Distillation
Venue: EMNLP 2019. The 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019).
Authors: Raphael Tang, Yao Lu, Jimmy Lin
Link: https://www.aclweb.org/anthology/D19-6122.pdf
53. Knowledge Distillation via Route Constrained Optimization
Venue: ICCV 2019.
Authors: Xiao Jin, Baoyun Peng, Yichao Wu, Yu Liu, Jiaheng Liu, Ding Liang, Junjie Yan, Xiaolin Hu
Link: https://openaccess.thecvf.com/content_ICCV_2019/papers/Jin_Knowledge_Distillation_via_Route_Constrained_Optimization_ICCV_2019_paper.pdf
54. Similarity-Preserving Knowledge Distillation
Venue: ICCV 2019.
Authors: Frederick Tung, Greg Mori
Link: https://openaccess.thecvf.com/content_ICCV_2019/papers/Tung_Similarity-Preserving_Knowledge_Distillation_ICCV_2019_paper.pdf
55. On the Efficacy of Knowledge Distillation
Venue: ICCV 2019.
Authors: Jang Hyun Cho, Bharath Hariharan
Link: https://openaccess.thecvf.com/content_ICCV_2019/papers/Cho_On_the_Efficacy_of_Knowledge_Distillation_ICCV_2019_paper.pdf
56. Correlation Congruence for Knowledge Distillation
Venue: ICCV 2019.
Authors: Baoyun Peng, Xiao Jin, Jiaheng Liu, Dongsheng Li, Yichao Wu, Yu Liu, Shunfeng Zhou, Zhaoning Zhang
Link: https://openaccess.thecvf.com/content_ICCV_2019/papers/Peng_Correlation_Congruence_for_Knowledge_Distillation_ICCV_2019_paper.pdf
57. Multilingual Neural Machine Translation with Knowledge Distillation
Venue: ICLR 2019.
Authors: Xu Tan, Yi Ren, Di He, Tao Qin, Zhou Zhao, Tie-Yan Liu
Link: https://openreview.net/pdf?id=S1gUsoR9YX
58. Zero-Shot Knowledge Distillation in Deep Networks
Venue: ICML 2019.
Authors: Gaurav Kumar Nayak, Konda Reddy Mopuri, Vaisakh Shaj, Venkatesh Babu Radhakrishnan, Anirban Chakraborty
Link: http://proceedings.mlr.press/v97/nayak19a/nayak19a.pdf
59. Towards Understanding Knowledge Distillation
Venue: ICML 2019.
Authors: Mary Phuong, Christoph Lampert
Link: http://proceedings.mlr.press/v97/phuong19a/phuong19a.pdf
60. Pedestrian Attribute Recognition by Joint Visual-semantic Reasoning and Knowledge Distillation
Venue: IJCAI 2019.
Authors: Qiaozhe Li, Xin Zhao, Ran He, Kaiqi Huang
Link: https://www.ijcai.org/proceedings/2019/0117.pdf
61. On Knowledge distillation from complex networks for response prediction
Venue: NAACL 2019.
Authors: Siddhartha Arora, Mitesh M. Khapra, Harish G. Ramaswamy
Link: https://www.aclweb.org/anthology/N19-1382.pdf
62. Self-supervised Knowledge Distillation Using Singular Value Decomposition
Venue: ECCV 2018.
Authors: Seung Hyun Lee, Dae Ha Kim, Byung Cheol Song
Link: https://openaccess.thecvf.com/content_ECCV_2018/papers/SEUNG_HYUN_LEE_Self-supervised_Knowledge_Distillation_ECCV_2018_paper.pdf
63. Apprentice: Using Knowledge Distillation Techniques To Improve Low-Precision Network Accuracy
Venue: ICLR 2018.
Authors: Asit K. Mishra, Debbie Marr
Link: https://openreview.net/pdf?id=B1ae1lZRb
64. Progressive Blockwise Knowledge Distillation for Neural Network Acceleration
Venue: IJCAI 2018.
Authors: Hui Wang, Hanbin Zhao, Xi Li, Xu Tan
Link: https://www.ijcai.org/proceedings/2018/0384.pdf
65. KDGAN: Knowledge Distillation with Generative Adversarial Networks
Venue: NeurIPS 2018.
Authors: Xiaojie Wang, Rui Zhang, Yu Sun, Jianzhong Qi
Link: https://papers.nips.cc/paper/7358-kdgan-knowledge-distillation-with-generative-adversarial-networks.pdf
66. Knowledge Distillation by On-the-Fly Native Ensemble
Venue: NeurIPS 2018.
Authors: Xu Lan, Xiatian Zhu, Shaogang Gong
Link: https://papers.nips.cc/paper/7980-knowledge-distillation-by-on-the-fly-native-ensemble.pdf
67. Learning to Specialize with Knowledge Distillation for Visual Question Answering
Venue: NeurIPS 2018.
Authors: Jonghwan Mun, Kimin Lee, Jinwoo Shin, Bohyung Han
Link: https://papers.nips.cc/paper/8031-learning-to-specialize-with-knowledge-distillation-for-visual-question-answering.pdf
68. WebChild 2.0: Fine-Grained Commonsense Knowledge Distillation
Venue: ACL 2017. System Demonstrations.
Authors: Niket Tandon, Gerard de Melo, Gerhard Weikum
Link: https://www.aclweb.org/anthology/P17-4020.pdf
69. A Gift From Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning
Venue: CVPR 2017.
Authors: Junho Yim, Donggyu Joo, Jihoon Bae, Junmo Kim
Link: https://openaccess.thecvf.com/content_cvpr_2017/papers/Yim_A_Gift_From_CVPR_2017_paper.pdf
70. Knowledge Distillation for Bilingual Dictionary Induction
Venue: EMNLP 2017.
Authors: Ndapandula Nakashole, Raphael Flauger
Link: https://www.aclweb.org/anthology/D17-1264.pdf
71. Visual Relationship Detection With Internal and External Linguistic Knowledge Distillation
Venue: ICCV 2017.
Authors: Ruichi Yu, Ang Li, Vlad I. Morariu, Larry S. Davis
Link: https://openaccess.thecvf.com/content_ICCV_2017/papers/Yu_Visual_Relationship_Detection_ICCV_2017_paper.pdf
72. Learning Efficient Object Detection Models with Knowledge Distillation
Venue: NeurIPS 2017.
Authors: Guobin Chen, Wongun Choi, Xiang Yu, Tony Han, Manmohan Chandraker
Link: https://papers.nips.cc/paper/6676-learning-efficient-object-detection-models-with-knowledge-distillation.pdf
73. Sequence-Level Knowledge Distillation
Venue: EMNLP 2016.
Authors: Yoon Kim, Alexander M. Rush
Link: https://www.aclweb.org/anthology/D16-1139.pdf