【Knowledge Distillation】A Curated List of Knowledge Distillation Papers
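
For context before the list: most of these papers build on the vanilla knowledge distillation objective of Hinton et al. (2015), in which a compact student is trained to match the temperature-softened output distribution of a larger teacher, alongside the usual cross-entropy on ground-truth labels. The sketch below is a minimal PyTorch rendition of that baseline loss for quick reference; it is not the method of any specific paper listed here, and the names `student_logits`, `teacher_logits`, `T`, and `alpha` are illustrative placeholders.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic knowledge distillation loss (Hinton et al., 2015)."""
    # Soft-target term: KL divergence between the teacher's and the
    # student's temperature-softened distributions. The T^2 factor keeps
    # the soft-term gradient at a comparable scale across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: standard cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

The online/mutual-distillation papers below (e.g., entries 2, 19, 25, 66) vary this recipe by training peer networks jointly, so that each network's softened predictions teach the others instead of a fixed pretrained teacher.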

1. Towards Cross-Modality Medical Image Segmentation with Online Mutual Knowledge Distillation
Conference: AAAI 2020. AAAI Technical Track: Applications.
Authors: Kang Li, Lequan Yu, Shujun Wang, Pheng-Ann Heng
Link: aaai.org/ojs/index.php/

2. Online Knowledge Distillation with Diverse Peers
Conference: AAAI 2020. AAAI Technical Track: Machine Learning.
Authors: Defang Chen, Jian-Ping Mei, Can Wang, Yan Feng, Chun Chen
Link: aaai.org/ojs/index.php/

3. Towards Oracle Knowledge Distillation with Neural Architecture Search
Conference: AAAI 2020. AAAI Technical Track: Machine Learning.
Authors: Minsoo Kang, Jonghwan Mun, Bohyung Han
Link: aaai.org/ojs/index.php/

4. Improved Knowledge Distillation via Teacher Assistant
Conference: AAAI 2020. AAAI Technical Track: Machine Learning.
Authors: Seyed-Iman Mirzadeh, Mehrdad Farajtabar, Ang Li, Nir Levine, Akihiro Matsukawa, Hassan Ghasemzadeh
Link: aaai.org/ojs/index.php/

5. Knowledge Distillation from Internal Representations
Conference: AAAI 2020. AAAI Technical Track: Natural Language Processing.
Authors: Gustavo Aguilar, Yuan Ling, Yu Zhang, Benjamin Yao, Xing Fan, Chenlei Guo
Link: aaai.org/ojs/index.php/

6. Ultrafast Video Attention Prediction with Coupled Knowledge Distillation
Conference: AAAI 2020. AAAI Technical Track: Vision.
Authors: Kui Fu, Peipei Shi, Yafei Song, Shiming Ge, Xiangju Lu, Jia Li
Link: aaai.org/ojs/index.php/

7. Uncertainty-Aware Multi-Shot Knowledge Distillation for Image-Based Object Re-Identification
Conference: AAAI 2020. AAAI Technical Track: Vision.
Authors: Xin Jin, Cuiling Lan, Wenjun Zeng, Zhibo Chen
Link: aaai.org/ojs/index.php/

8. Structure-Level Knowledge Distillation For Multilingual Sequence Labeling
Conference: ACL 2020.
Authors: Xinyu Wang, Yong Jiang, Nguyen Bach, Tao Wang, Fei Huang, Kewei Tu
Link: aclweb.org/anthology/20

9. Knowledge Distillation for Multilingual Unsupervised Neural Machine Translation
Conference: ACL 2020.
Authors: Haipeng Sun, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao
Link: aclweb.org/anthology/20

10. TextBrewer: An Open-Source Knowledge Distillation Toolkit for Natural Language Processing
Conference: ACL 2020. System Demonstrations.
Authors: Ziqing Yang, Yiming Cui, Zhipeng Chen, Wanxiang Che, Ting Liu, Shijin Wang, Guoping Hu
Link: aclweb.org/anthology/20

11. End-to-End Speech-Translation with Knowledge Distillation: FBK@IWSLT2020
Conference: ACL 2020. The 17th International Conference on Spoken Language Translation.
Authors: Marco Gaido, Mattia A. Di Gangi, Matteo Negri, Marco Turchi
Link: aclweb.org/anthology/20

12. Exploring the Limits of Simple Learners in Knowledge Distillation for Document Classification with DocBERT
Conference: ACL 2020. The 5th Workshop on Representation Learning for NLP.
Authors: Ashutosh Adhikari, Achyudh Ram, Raphael Tang, William L. Hamilton, Jimmy Lin
Link: aclweb.org/anthology/20

13. Neural Networks Are More Productive Teachers Than Human Raters: Active Mixup for Data-Efficient Knowledge Distillation From a Blackbox Model
Conference: CVPR 2020.
Authors: Dongdong Wang, Yandong Li, Liqiang Wang, Boqing Gong
Link: openaccess.thecvf.com/c

14. Creating Something From Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing
Conference: CVPR 2020.
Authors: Hengtong Hu, Lingxi Xie, Richang Hong, Qi Tian
Link: openaccess.thecvf.com/c

15. Heterogeneous Knowledge Distillation Using Information Flow Modeling
Conference: CVPR 2020.
Authors: Nikolaos Passalis, Maria Tzelepi, Anastasios Tefas
Link: openaccess.thecvf.com/c

16. Revisiting Knowledge Distillation via Label Smoothing Regularization
Conference: CVPR 2020.
Authors: Li Yuan, Francis EH Tay, Guilin Li, Tao Wang, Jiashi Feng
Link: openaccess.thecvf.com/c

17. Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation
Conference: CVPR 2020.
Authors: Changlin Li, Jiefeng Peng, Liuchun Yuan, Guangrun Wang, Xiaodan Liang, Liang Lin, Xiaojun Chang
Link: openaccess.thecvf.com/c

18. Few Sample Knowledge Distillation for Efficient Network Compression
Conference: CVPR 2020.
Authors: Tianhong Li, Jianguo Li, Zhuang Liu, Changshui Zhang
Link: openaccess.thecvf.com/c

19. Online Knowledge Distillation via Collaborative Learning
Conference: CVPR 2020.
Authors: Qiushan Guo, Xinjiang Wang, Yichao Wu, Zhipeng Yu, Ding Liang, Xiaolin Hu, Ping Luo
Link: openaccess.thecvf.com/c

20. Circumventing Outliers of AutoAugment with Knowledge Distillation
Conference: ECCV 2020.
Authors: Longhui Wei, An Xiao, Lingxi Xie, Xiaopeng Zhang, Xin Chen, Qi Tian
Link: ecva.net/papers/eccv_20

21. Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification
Conference: ECCV 2020.
Authors: Liuyu Xiang, Guiguang Ding, Jungong Han
Link: ecva.net/papers/eccv_20

22. Knowledge Distillation Meets Self-Supervision
Conference: ECCV 2020.
Authors: Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy
Link: ecva.net/papers/eccv_20

23. Robust Re-Identification by Multiple Views Knowledge Distillation
Conference: ECCV 2020.
Authors: Angelo Porrello, Luca Bergamini, Simone Calderara
Link: ecva.net/papers/eccv_20

24. Local Correlation Consistency for Knowledge Distillation
Conference: ECCV 2020.
Authors: Xiaojie Li, Jianlong Wu, Hongyu Fang, Yue Liao, Fei Wang, Chen Qian
Link: ecva.net/papers/eccv_20

25. AMLN: Adversarial-based Mutual Learning Network for Online Knowledge Distillation
Conference: ECCV 2020.
Authors: Xiaobing Zhang, Shijian Lu, Haigang Gong, Zhipeng Luo, Ming Liu
Link: ecva.net/papers/eccv_20

26. Differentiable Feature Aggregation Search for Knowledge Distillation
Conference: ECCV 2020.
Authors: Yushuo Guan, Pengyu Zhao, Bingxuan Wang, Yuanxing Zhang, Cong Yao, Kaigui Bian, Jian Tang
Link: ecva.net/papers/eccv_20

27. Online Ensemble Model Compression using Knowledge Distillation
Conference: ECCV 2020.
Authors: Devesh Walawalkar, Zhiqiang Shen, Marios Savvides
Link: ecva.net/papers/eccv_20

28. Exclusivity-Consistency Regularized Knowledge Distillation for Face Recognition
Conference: ECCV 2020.
Authors: Xiaobo Wang, Tianyu Fu, Shengcai Liao, Shuo Wang, Zhen Lei, Tao Mei
Link: ecva.net/papers/eccv_20

29. Feature Normalized Knowledge Distillation for Image Classification
Conference: ECCV 2020.
Authors: Kunran Xu, Lai Rui, Yishi Li, Lin Gu
Link: ecva.net/papers/eccv_20

30. Weight Decay Scheduling and Knowledge Distillation for Active Learning
Conference: ECCV 2020.
Authors: Juseung Yun, Byungjoo Kim, Junmo Kim
Link: ecva.net/papers/eccv_20

31. Semantic Relation Preserving Knowledge Distillation for Image-to-Image Translation
Conference: ECCV 2020.
Authors: Zeqi Li, Ruowei Jiang, Parham Aarabi
Link: ecva.net/papers/eccv_20

32. Interpretable Foreground Object Search As Knowledge Distillation
Conference: ECCV 2020.
Authors: Boren Li, Po-Yu Zhuang, Jian Gu, Mingyang Li, Ping Tan
Link: ecva.net/papers/eccv_20

33. Improving Knowledge Distillation via Category Structure
Conference: ECCV 2020.
Authors: Zailiang Chen, Xianxian Zheng, Hailan Shen, Ziyang Zeng, Yukun Zhou, Rongchang Zhao
Link: ecva.net/papers/eccv_20

34. Making Monolingual Sentence Embeddings Multilingual Using Knowledge Distillation
Conference: EMNLP 2020. Long Paper.
Authors: Nils Reimers, Iryna Gurevych
Link: aclweb.org/anthology/20

35. Lifelong Language Knowledge Distillation
Conference: EMNLP 2020. Long Paper.
Authors: Yung-Sung Chuang, Shang-Yu Su, Yun-Nung Chen
Link: aclweb.org/anthology/20

36. Autoregressive Knowledge Distillation through Imitation Learning
Conference: EMNLP 2020. Long Paper.
Authors: Alexander Lin, Jeremy Wohlwend, Howard Chen, Tao Lei
Link: aclweb.org/anthology/20

37. Improving Neural Topic Models Using Knowledge Distillation
Conference: EMNLP 2020. Long Paper.
Authors: Alexander Miserlis Hoyle, Pranav Goel, Philip Resnik
Link: aclweb.org/anthology/20

38. Why Skip If You Can Combine: A Simple Knowledge Distillation Technique for Intermediate Layers
Conference: EMNLP 2020. Short Paper.
Authors: Yimeng Wu, Peyman Passban, Mehdi Rezagholizadeh, Qun Liu
Link: aclweb.org/anthology/20

39. Understanding Knowledge Distillation in Non-autoregressive Machine Translation
Conference: ICLR 2020.
Authors: Chunting Zhou, Jiatao Gu, Graham Neubig
Link: openreview.net/pdf?

40. P-KDGAN: Progressive Knowledge Distillation with GANs for One-class Novelty Detection
Conference: IJCAI 2020.
Authors: Zhiwei Zhang, Shifeng Chen, Lei Sun
Link: ijcai.org/proceedings/2

41. Private Model Compression via Knowledge Distillation
Conference: AAAI 2019. AAAI Technical Track: Applications.
Authors: Ji Wang, Weidong Bao, Lichao Sun, Xiaomin Zhu, Bokai Cao, Philip S. Yu
Link: aaai.org/ojs/index.php/

42. Knowledge Distillation with Adversarial Samples Supporting Decision Boundary
Conference: AAAI 2019. AAAI Technical Track: Machine Learning.
Authors: Byeongho Heo, Minsik Lee, Sangdoo Yun, Jin Young Choi
Link: aaai.org/ojs/index.php/

43. Exploiting the Ground-Truth: An Adversarial Imitation Based Knowledge Distillation Approach for Event Detection
Conference: AAAI 2019. AAAI Technical Track: Natural Language Processing.
Authors: Jian Liu, Yubo Chen, Kang Liu
Link: aaai.org/ojs/index.php/

44. Scalable Syntax-Aware Language Models Using Knowledge Distillation
Conference: ACL 2019.
Authors: Adhiguna Kuncoro, Chris Dyer, Laura Rimell, Stephen Clark, Phil Blunsom
Link: aclweb.org/anthology/P1

45. PANLP at MEDIQA 2019: Pre-trained Language Models, Transfer Learning and Knowledge Distillation
Conference: ACL 2019. The 18th BioNLP Workshop and Shared Task.
Authors: Wei Zhu, Xiaofeng Zhou, Keqiang Wang, Xun Luo, Xiepeng Li, Yuan Ni, Guotong Xie
Link: aclweb.org/anthology/W1

46. Structured Knowledge Distillation for Semantic Segmentation
Conference: CVPR 2019.
Authors: Yifan Liu, Ke Chen, Chris Liu, Zengchang Qin, Zhenbo Luo, Jingdong Wang
Link: openaccess.thecvf.com/c

47. Relational Knowledge Distillation
Conference: CVPR 2019.
Authors: Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho
Link: openaccess.thecvf.com/c

48. Knowledge Distillation via Instance Relationship Graph
Conference: CVPR 2019.
Authors: Yufan Liu, Jiajiong Cao, Bing Li, Chunfeng Yuan, Weiming Hu, Yangxi Li, Yunqiang Duan
Link: openaccess.thecvf.com/c

49. Refine and Distill: Exploiting Cycle-Inconsistency and Knowledge Distillation for Unsupervised Monocular Depth Estimation
Conference: CVPR 2019.
Authors: Andrea Pilzer, Stephane Lathuiliere, Nicu Sebe, Elisa Ricci
Link: openaccess.thecvf.com/c

50. Patient Knowledge Distillation for BERT Model Compression
Conference: EMNLP 2019.
Authors: Siqi Sun, Yu Cheng, Zhe Gan, Jingjing Liu
Link: aclweb.org/anthology/D1

51. Weakly Supervised Cross-lingual Semantic Relation Classification via Knowledge Distillation
Conference: EMNLP 2019.
Authors: Yogarshi Vyas, Marine Carpuat
Link: aclweb.org/anthology/D1

52. Natural Language Generation for Effective Knowledge Distillation
Conference: EMNLP 2019. The 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019).
Authors: Raphael Tang, Yao Lu, Jimmy Lin
Link: aclweb.org/anthology/D1

53. Knowledge Distillation via Route Constrained Optimization
Conference: ICCV 2019.
Authors: Xiao Jin, Baoyun Peng, Yichao Wu, Yu Liu, Jiaheng Liu, Ding Liang, Junjie Yan, Xiaolin Hu
Link: openaccess.thecvf.com/c

54. Similarity-Preserving Knowledge Distillation
Conference: ICCV 2019.
Authors: Frederick Tung, Greg Mori
Link: openaccess.thecvf.com/c

55. On the Efficacy of Knowledge Distillation
Conference: ICCV 2019.
Authors: Jang Hyun Cho, Bharath Hariharan
Link: openaccess.thecvf.com/c

56. Correlation Congruence for Knowledge Distillation
Conference: ICCV 2019.
Authors: Baoyun Peng, Xiao Jin, Jiaheng Liu, Dongsheng Li, Yichao Wu, Yu Liu, Shunfeng Zhou, Zhaoning Zhang
Link: openaccess.thecvf.com/c

57. Multilingual Neural Machine Translation with Knowledge Distillation
Conference: ICLR 2019.
Authors: Xu Tan, Yi Ren, Di He, Tao Qin, Zhou Zhao, Tie-Yan Liu
Link: openreview.net/pdf?

58. Zero-Shot Knowledge Distillation in Deep Networks
Conference: ICML 2019.
Authors: Gaurav Kumar Nayak, Konda Reddy Mopuri, Vaisakh Shaj, Venkatesh Babu Radhakrishnan, Anirban Chakraborty
Link: proceedings.mlr.press/v

59. Towards Understanding Knowledge Distillation
Conference: ICML 2019.
Authors: Mary Phuong, Christoph Lampert
Link: proceedings.mlr.press/v

60. Pedestrian Attribute Recognition by Joint Visual-semantic Reasoning and Knowledge Distillation
Conference: IJCAI 2019.
Authors: Qiaozhe Li, Xin Zhao, Ran He, Kaiqi Huang
Link: ijcai.org/proceedings/2

61. On Knowledge distillation from complex networks for response prediction
Conference: NAACL 2019.
Authors: Siddhartha Arora, Mitesh M. Khapra, Harish G. Ramaswamy
Link: aclweb.org/anthology/N1

62. Self-supervised Knowledge Distillation Using Singular Value Decomposition
Conference: ECCV 2018.
Authors: Seung Hyun Lee, Dae Ha Kim, Byung Cheol Song
Link: openaccess.thecvf.com/c

63. Apprentice: Using Knowledge Distillation Techniques To Improve Low-Precision Network Accuracy
Conference: ICLR 2018.
Authors: Asit K. Mishra, Debbie Marr
Link: openreview.net/pdf?

64. Progressive Blockwise Knowledge Distillation for Neural Network Acceleration
Conference: IJCAI 2018.
Authors: Hui Wang, Hanbin Zhao, Xi Li, Xu Tan
Link: ijcai.org/proceedings/2

65. KDGAN: Knowledge Distillation with Generative Adversarial Networks
Conference: NeurIPS 2018.
Authors: Xiaojie Wang, Rui Zhang, Yu Sun, Jianzhong Qi
Link: papers.nips.cc/paper/73

66. Knowledge Distillation by On-the-Fly Native Ensemble
Conference: NeurIPS 2018.
Authors: Xu Lan, Xiatian Zhu, Shaogang Gong
Link: papers.nips.cc/paper/79

67. Learning to Specialize with Knowledge Distillation for Visual Question Answering
Conference: NeurIPS 2018.
Authors: Jonghwan Mun, Kimin Lee, Jinwoo Shin, Bohyung Han
Link: papers.nips.cc/paper/80

68. WebChild 2.0 : Fine-Grained Commonsense Knowledge Distillation
Conference: ACL 2017. System Demonstrations.
Authors: Niket Tandon, Gerard de Melo, Gerhard Weikum
Link: aclweb.org/anthology/P1

69. A Gift From Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning
Conference: CVPR 2017.
Authors: Junho Yim, Donggyu Joo, Jihoon Bae, Junmo Kim
Link: openaccess.thecvf.com/c

70. Knowledge Distillation for Bilingual Dictionary Induction
Conference: EMNLP 2017.
Authors: Ndapandula Nakashole, Raphael Flauger
Link: aclweb.org/anthology/D1

71. Visual Relationship Detection With Internal and External Linguistic Knowledge Distillation
Conference: ICCV 2017.
Authors: Ruichi Yu, Ang Li, Vlad I. Morariu, Larry S. Davis
Link: openaccess.thecvf.com/c

72. Learning Efficient Object Detection Models with Knowledge Distillation
Conference: NeurIPS 2017.
Authors: Guobin Chen, Wongun Choi, Xiang Yu, Tony Han, Manmohan Chandraker
Link: papers.nips.cc/paper/66

73. Sequence-Level Knowledge Distillation
Conference: EMNLP 2016.
Authors: Yoon Kim, Alexander M. Rush
Link: aclweb.org/anthology/D1

Published on 2021-05-12 15:38