【Zhuanzhi Collection 14】Machine Translation: A Complete Knowledge Compendium (Introductory/Advanced/Surveys/Videos/Code/Experts, with PDF download)

Tap "专知" above to follow and get professional AI knowledge!


【Introduction】Topic collections are one of Zhuanzhi's core features, offering users systematic knowledge services for the AI field. Each collection gathers and organizes the best (Awesome) resources on a topic from across the web, so that AI practitioners can learn conveniently and solve problems at work. Collections are built on Zhuanzhi's AI topic knowledge tree, compiled by professional human editors with the help of algorithmic tools, and kept continuously updated. If you are interested in creating collections like this one, you are welcome to join the Zhuanzhi AI creators program and build them with us. Today we present the fourteenth Zhuanzhi topic collection: Machine Translation (introductory materials, advanced papers, surveys, videos, code, experts, and more). Visit www.zhuanzhi.ai, or follow the WeChat account and reply "专知" in the background to enter Zhuanzhi and search for the topic "机器翻译" (machine translation). The complete version, with clickable links to every resource and a PDF download link, is also available on desktop and mobile at www.zhuanzhi.ai; see the end of this article. This is an initial version: corrections and additions are welcome in the comments, and please share and forward!


  • Machine Translation Zhuanzhi Collection

    • Introductory Learning

    • Advanced Papers

    • Surveys

    • Tutorial

    • Video Tutorials

    • Code

    • Domain Experts


Introductory Learning

  1. CIPS Young Scholars Committee academic column, issue 9: Neural Machine Translation http://www.cipsc.org.cn/qngw/?p=953

  2. Advances in Deep-Learning-Based Machine Translation Research (基于深度学习的机器翻译研究进展) http://www.caai.cn/index.php?s=/Home/Article/qikandetail/year/2016/month/02.html

  3. An accessible 35-slide introduction to deep-learning machine translation (with an accompanying video tutorial) http://mp.weixin.qq.com/s/pnJDuXxw2VI9zEWgNivKdw

  4. Kyunghyun Cho's introduction to neural machine translation [https://devblogs.nvidia.com/parallelforall/introduction-neural-machine-translation-with-gpus/] [http://devblogs.nvidia.com/parallelforall/introduction-neural-machine-translation-gpus-part-2/] [https://devblogs.nvidia.com/parallelforall/introduction-neural-machine-translation-gpus-part-3/]

  5. Neural Machine Translation (1): Encoder-Decoder Architecture; (2): Attention Mechanism (see the encoder-decoder sketch after this list) [http://blog.csdn.net/u011414416/article/details/51048994] [http://blog.csdn.net/u011414416/article/details/51057789]

  6. The TensorFlow neural machine translation tutorial [https://github.com/tensorflow/nmt]

  7. Rico Sennrich's tutorial at AMTA 2016 http://statmt.org/mtma16/uploads/mtma16-neural.pdf
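Items 4-6 above all walk through the encoder-decoder architecture that underlies neural machine translation: an encoder RNN reads the source sentence into hidden states, and a decoder RNN emits target words conditioned on them. As a companion to those readings, here is a minimal PyTorch sketch of that architecture; it is a sketch only, and the vocabulary sizes, dimensions, and toy batch are illustrative assumptions, not taken from any listed tutorial.

```python
# Minimal RNN encoder-decoder sketch (cf. Sutskever et al., 2014; Cho et al., 2014).
# All sizes and the toy batch below are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=32, hid_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                      # src: (batch, src_len) token ids
        outputs, hidden = self.rnn(self.embed(src))
        return outputs, hidden                   # hidden summarizes the source

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=32, hid_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tgt, hidden):              # tgt: (batch, tgt_len) token ids
        outputs, hidden = self.rnn(self.embed(tgt), hidden)
        return self.out(outputs), hidden         # per-step target-vocabulary logits

# Toy usage: encode a source batch, then decode with teacher forcing.
enc, dec = Encoder(vocab_size=100), Decoder(vocab_size=120)
src = torch.randint(0, 100, (2, 7))              # 2 sentences, 7 source tokens each
tgt = torch.randint(0, 120, (2, 5))              # 5 target tokens each
_, hidden = enc(src)
logits, _ = dec(tgt, hidden)
print(logits.shape)                              # torch.Size([2, 5, 120])
```

Note the bottleneck: the decoder sees the source only through one fixed-size hidden state. The attention mechanism covered in item 5 (and in Bahdanau et al. below) was introduced precisely to remove that bottleneck.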


Advanced Papers

1997

  1. Neco, R. P., & Forcada, M. L. (1997). Asynchronous translations with recurrent neural nets. In Proceedings of the International Conference on Neural Networks (Vol. 4, pp. 2535-2540). IEEE.
    [http://ieeexplore.ieee.org/document/614693/]


2003

  1. Bengio, Y., Ducharme, R., Vincent, P., & Jauvin, C. (2003). A neural probabilistic language model. Journal of Machine Learning Research, 3(Feb), 1137-1155.
    [http://www.jmlr.org/papers/volume3/bengio03a/bengio03a.pdf]


2010

  1. Sudoh, K., Duh, K., Tsukada, H., Hirao, T., & Nagata, M. (2010, July). Divide and translate: improving long distance reordering in statistical machine translation. In Proceedings of the Joint Fifth Workshop on Statistical Machine Translation and MetricsMATR (pp. 418-427). Association for Computational Linguistics.
    [https://dl.acm.org/citation.cfm?id=1868912]


2013

  1. Kalchbrenner, N., & Blunsom, P. (2013, October). Recurrent continuous translation models. In Proceedings of EMNLP 2013 (pp. 1700-1709).
    [https://www.researchgate.net/publication/289758666_Recurrent_continuous_translation_models]

  2. Pascanu, R., Mikolov, T., & Bengio, Y. (2013, February). On the difficulty of training recurrent neural networks. In International Conference on Machine Learning (pp. 1310-1318).
    [http://arxiv.org/abs/1211.5063]


2014

  1. Mnih, V., Heess, N., & Graves, A. (2014). Recurrent models of visual attention. In Advances in Neural Information Processing Systems (pp. 2204-2212).
    [http://arxiv.org/abs/1406.6247]

  2. Sutskever, I., Vinyals, O., & Le, Q. V. (2014). Sequence to sequence learning with neural networks. In Advances in Neural Information Processing Systems (pp. 3104-3112).
    [https://arxiv.org/abs/1409.3215]

  3. Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.
    [http://arxiv.org/abs/1406.1078]

  4. Bahdanau, D., Cho, K., & Bengio, Y. (2014). Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473. (A minimal sketch of this additive attention follows this list.)
    [https://arxiv.org/abs/1409.0473]

  5. Jean, S., Cho, K., Memisevic, R., & Bengio, Y. (2014). On using very large target vocabulary for neural machine translation. arXiv preprint arXiv:1412.2007.
    [http://arxiv.org/abs/1412.2007]

  6. Luong, M. T., Sutskever, I., Le, Q. V., Vinyals, O., & Zaremba, W. (2014). Addressing the rare word problem in neural machine translation. arXiv preprint arXiv:1410.8206.
    [http://arxiv.org/abs/1410.8206]
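Item 4 above (Bahdanau et al., 2014) introduced the additive attention mechanism that lets the decoder consult all encoder states h_j rather than one summary vector, scoring each with e_ij = v^T tanh(W_s s_{i-1} + W_h h_j). Below is a minimal NumPy sketch of the score, softmax, and context computation; the dimensions and random weights are illustrative assumptions.

```python
# Additive (Bahdanau-style) attention sketch; sizes and weights are illustrative.
import numpy as np

def additive_attention(s_prev, enc_states, W_s, W_h, v):
    # s_prev: (dec_dim,) previous decoder state; enc_states: (src_len, enc_dim)
    scores = np.tanh(enc_states @ W_h.T + s_prev @ W_s.T) @ v   # (src_len,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                    # softmax over source
    context = weights @ enc_states                              # (enc_dim,) context vector
    return context, weights

rng = np.random.default_rng(0)
dec_dim, enc_dim, att_dim, src_len = 4, 6, 5, 3
context, weights = additive_attention(
    rng.normal(size=dec_dim), rng.normal(size=(src_len, enc_dim)),
    rng.normal(size=(att_dim, dec_dim)), rng.normal(size=(att_dim, enc_dim)),
    rng.normal(size=att_dim))
print(weights.round(3), context.shape)   # weights sum to 1; context is (6,)
```

In the full model the context vector feeds into each decoding step, so every target word can attend to different source positions instead of a single fixed summary.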


2015

  1. Sennrich, R., Haddow, B., & Birch, A. (2015). Improving neural machine translation models with monolingual data. arXiv preprint arXiv:1511.06709. (A schematic back-translation sketch follows this list.)
    [http://arxiv.org/abs/1511.06709]

  2. Dong, D., Wu, H., He, W., Yu, D., & Wang, H. (2015). Multi-Task Learning for Multiple Language Translation. In ACL (1) (pp. 1723-1732).
    [http://www.anthology.aclweb.org/P/P15/P15-1166.pdf]

  3. Shen, S., Cheng, Y., He, Z., He, W., Wu, H., Sun, M., & Liu, Y. (2015). Minimum risk training for neural machine translation. arXiv preprint arXiv:1512.02433.
    [https://arxiv.org/abs/1512.02433]

  4. Bojar, O., Chatterjee, R., Federmann, C., et al. (2015). Findings of the 2015 Workshop on Statistical Machine Translation. In Proceedings of the Tenth Workshop on Statistical Machine Translation.
    [https://www-test.pure.ed.ac.uk/portal/files/23139669/W15_3001.pdf]
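Item 1 above (Sennrich et al., 2015) augments training data via back-translation: target-language monolingual text is translated by a reverse (target-to-source) model, and the resulting synthetic source sentences are paired with the genuine targets as extra parallel data. A schematic sketch of that loop follows; `reverse_model.translate` and `train` are hypothetical placeholders, not APIs of any toolkit listed here.

```python
# Schematic back-translation loop (Sennrich et al., 2015).
# `reverse_model.translate` and `train` are hypothetical placeholders.

def back_translate(reverse_model, mono_target_sentences):
    synthetic_pairs = []
    for tgt in mono_target_sentences:
        synthetic_src = reverse_model.translate(tgt)  # hypothetical target->source API
        synthetic_pairs.append((synthetic_src, tgt))  # source is synthetic, target is real
    return synthetic_pairs

# Usage sketch: mix synthetic pairs with genuine parallel data, then train
# the forward (source->target) model on the combined corpus:
# combined = parallel_pairs + back_translate(reverse_model, mono_corpus)
# train(forward_model, combined)
```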


2016

  1. Wu, Y., Schuster, M., Chen, Z., Le, Q. V., Norouzi, M., Macherey, W., … & Klingner, J. (2016). Google's neural machine translation system: Bridging the gap between human and machine translation. arXiv preprint arXiv:1609.08144.
    [https://arxiv.org/abs/1609.08144v1]

  2. Gehring, J., Auli, M., Grangier, D., & Dauphin, Y. N. (2016). A convolutional encoder model for neural machine translation. arXiv preprint arXiv:1611.02344.
    [https://arxiv.org/abs/1611.02344]

  3. Cheng, Y., Xu, W., He, Z., He, W., Wu, H., Sun, M., & Liu, Y. (2016). Semi-supervised learning for neural machine translation. arXiv preprint arXiv:1606.04596.
    [http://arxiv.org/abs/1606.04596]

  4. Wang, M., Lu, Z., Li, H., & Liu, Q. (2016). Memory-enhanced decoder for neural machine translation. arXiv preprint arXiv:1606.02003.
    [https://arxiv.org/abs/1606.02003]

  5. Sennrich, R., & Haddow, B. (2016). Linguistic input features improve neural machine translation. arXiv preprint arXiv:1606.02892.
    [http://arxiv.org/abs/1606.02892]

  6. Tu, Z., Lu, Z., Liu, Y., Liu, X., & Li, H. (2016). Modeling coverage for neural machine translation. arXiv preprint arXiv:1601.04811. (A minimal sketch of the coverage vector follows this list.)
    [http://arxiv.org/abs/1601.04811]

  7. Cohn, T., Hoang, C. D. V., Vymolova, E., Yao, K., Dyer, C., & Haffari, G. (2016). Incorporating structural alignment biases into an attentional neural translation model. arXiv preprint arXiv:1601.01085.
    [http://www.m-mitchell.com/NAACL-2016/NAACL-HLT2016/pdf/N16-1102.pdf]

  8. Hitschler, J., Schamoni, S., & Riezler, S. (2016). Multimodal pivots for image caption translation. arXiv preprint arXiv:1601.03916.
    [https://arxiv.org/abs/1601.03916]

  9. Junczys-Dowmunt, M., Dwojak, T., & Hoang, H. (2016). Is neural machine translation ready for deployment? A case study on 30 translation directions. arXiv preprint arXiv:1610.01108.
    [https://arxiv.org/abs/1610.01108]

  10. Johnson, M., Schuster, M., Le, Q. V., Krikun, M., Wu, Y., Chen, Z., … & Hughes, M. (2016). Google's multilingual neural machine translation system: Enabling zero-shot translation. arXiv preprint arXiv:1611.04558.
    [https://arxiv.org/abs/1611.04558]

  11. Bartolome, Diego, and Gema Ramirez. "Beyond the Hype of Neural Machine Translation," MIT Technology Review (May 23, 2016), bit.ly/2aG4bvR.
    [https://www.slideshare.net/TAUS/beyond-the-hype-of-neural-machine-translation-diego-bartolome-tauyou-and-gema-ramirez-prompsit-language-engineering]

  12. Crego, J., Kim, J., Klein, G., Rebollo, A., Yang, K., Senellart, J., … & Enoue, S. (2016). SYSTRAN's Pure Neural Machine Translation Systems. arXiv preprint arXiv:1610.05540.
    [https://arxiv.org/abs/1610.05540]
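Item 6 above (Tu et al., 2016) addresses over- and under-translation by maintaining a coverage vector that accumulates, for every source word, the attention mass spent on it across decoding steps; the next step's attention is then conditioned on this vector. Below is a minimal NumPy sketch of the accumulation step, with a random attention matrix as an illustrative assumption.

```python
# Coverage-vector accumulation sketch (Tu et al., 2016); attention is random here.
import numpy as np

rng = np.random.default_rng(0)
src_len, dec_steps = 5, 4
attn = rng.dirichlet(np.ones(src_len), size=dec_steps)  # one attention distribution per step

coverage = np.zeros(src_len)
for t in range(dec_steps):
    coverage += attn[t]        # coverage_t = coverage_{t-1} + attention_t
    # In the full model, coverage feeds back into the next step's attention scores.
print(coverage.round(3))       # ideally close to 1 for every source word when decoding ends
```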


2017

  1. Huang, P.-S., Wang, C., Zhou, D., & Deng, L. (2017). Neural phrase-based machine translation. arXiv preprint arXiv:1706.05565. (Microsoft)
    [http://arxiv.org/abs/1706.05565]

  2. A Neural Network for Machine Translation, at Production Scale. (2017). Google Research Blog. Retrieved 26 July 2017, from [https://research.googleblog.com/2016/09/a-neural-network-for-machine.html]
    [http://www.googblogs.com/a-neural-network-for-machine-translation-at-production-scale/]

  3. Gehring, J., Auli, M., Grangier, D., Yarats, D., & Dauphin, Y. N. (2017). Convolutional sequence to sequence learning. arXiv preprint arXiv:1705.03122. (Facebook)
    [https://arxiv.org/abs/1705.03122]

  4. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., … & Polosukhin, I. (2017). Attention is all you need. arXiv preprint arXiv:1706.03762. (Google; a minimal sketch of scaled dot-product attention follows this list.)
    [https://arxiv.org/abs/1706.03762]

  5. Train Neural Machine Translation Models with Sockeye. (2017). Amazon Web Services. Retrieved 26 July 2017, from
    [https://aws.amazon.com/blogs/ai/train-neural-machine-translation-models-with-sockeye/]

  6. Dandekar, N. (2017). How does an attention mechanism work in deep learning for natural language processing? Quora. Retrieved 26 July 2017, from
    [https://www.quora.com/How-does-an-attention-mechanism-work-in-deep-learning-for-natural-language-processing]

  7. Microsoft Translator launching Neural Network based translations for all its speech languages. (2017). Microsoft Translator Blog. Retrieved 27 July 2017, from
    [https://blogs.msdn.microsoft.com/translation/2016/11/15/microsoft-translator-launching-neural-network-based-translations-for-all-its-speech-languages/]

  8. ACL 2017. (2017). Accepted Papers, Demonstrations and TACL Articles for ACL 2017. [online] Available at:
    [https://chairs-blog.acl2017.org/2017/04/05/accepted-papers-and-demonstrations/] [Accessed 7 Aug. 2017].
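Item 4 above (Vaswani et al., 2017) dispenses with recurrence and convolution entirely, building the Transformer from scaled dot-product attention: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. Below is a minimal NumPy sketch of that core operation, with random Q, K, V as illustrative assumptions.

```python
# Scaled dot-product attention sketch ("Attention Is All You Need", 2017).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (len_q, len_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # (len_q, d_v) attended values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 8)), rng.normal(size=(5, 8)), rng.normal(size=(5, 8)))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)
```

Scaling by sqrt(d_k) keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients; this is the motivation the paper itself gives.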


Surveys

  1. "Recent Advances in Neural Machine Translation" (神经机器翻译前沿进展), by Yang Liu (刘洋) of Tsinghua University [http://crad.ict.ac.cn/CN/abstract/abstract3422.shtml]

  2. Thang Luong's Stanford PhD thesis [https://github.com/lmthang/thesis/blob/master/thesis.pdf]

  3. Deep Neural Networks in Machine Translation: An Overview [http://www.nlpr.ia.ac.cn/cip/ZongPublications/2015/IEEE-Zhang-8-5.pdf]


Tutorial

  1. ACL 2016 Tutorial -- Neural Machine Translation: the tutorial given by Thang Luong at ACL 2016 [http://nlp.stanford.edu/projects/nmt/Luong-Cho-Manning-NMT-ACL2016-v4.pdf]

  2. "Recent Advances in Neural Machine Translation": a talk by Yang Liu (Tsinghua University) at the 12th China Workshop on Machine Translation (CWMT 2016, held in Urumqi, August 2016) [http://nlp.csai.tsinghua.edu.cn/~ly/talks/cwmt2016_ly_v3_160826.pptx]

  3. CCL2016 | T1B: Deep Learning and Machine Translation, a tutorial at the 15th China National Conference on Computational Linguistics (CCL 2016) [http://www.cips-cl.org/static/CCL2016/tutorialsT1B.html]

  4. Neural Machine Translation [http://statmt.org/mtma16/uploads/mtma16-neural.pdf]

  5. The ACL 2016 tutorial site by Thang Luong, Kyunghyun Cho, and Christopher Manning [https://sites.google.com/site/acl16nmt/]

  6. Kyunghyun Cho's talk "New Territory of Machine Translation", mainly covering the NMT problems Cho himself focuses on [https://drive.google.com/file/d/0B16RwCMQqrtdRVotWlQ3T2ZXTmM/view]


Video Tutorials

  1. CS224d: neural machine translation [https://cs224d.stanford.edu/lectures/CS224d-Lecture15.pdf] [https://www.youtube.com/watch?v=IxQtK2SjWWM&index=11&list=PL3FW7Lu3i5Jsnh1rnUwq_TcylNr7EkRe6]

  2. Yang Liu (刘洋, Tsinghua University): Machine Translation Based on Deep Learning

  3. A Practical Guide to Neural Machine Translation [https://www.youtube.com/watch?v=vxibD6VaOfI]


Code

  1. seq2seq: an implementation of Google's seq2seq model, built on TensorFlow. [https://github.com/tensorflow/tensorflow]

  2. nmt.matlab: open-sourced by Thang Luong (Stanford), written in Matlab. [https://github.com/lmthang/nmt.matlab]

  3. GroundHog: an attention-based neural machine translation model from Bengio's group, built on Theano. [https://github.com/lisa-groundhog/GroundHog]

  4. NMT-Coverage: a coverage-based neural machine translation model from Hang Li's team at Huawei Noah's Ark Lab, built on Theano. [https://github.com/tuzhaopeng/NMT-Coverage]

  5. OpenNMT: an industrial-strength neural machine translation toolkit open-sourced by the Harvard NLP group, built on Torch. [http://opennmt.net/]

  6. EUREKA-MangoNMT: developed in C++ by Jiajun Zhang (张家俊) of the Institute of Automation, Chinese Academy of Sciences. [https://github.com/jiajunzhangnlp/EUREKA-MangoNMT]

  7. dl4mt-tutorial: built on Theano. [https://github.com/nyu-dl/dl4mt-tutorial]


Domain Experts

  1. Université de Montréal: Yoshua Bengio, Dzmitry Bahdanau

  2. New York University: Kyunghyun Cho

  3. Stanford University: Christopher Manning, Thang Luong

  4. Google: Ilya Sutskever, Quoc V. Le

  5. Institute of Computing Technology, Chinese Academy of Sciences: Qun Liu (刘群)

  6. Northeastern University: Jingbo Zhu (朱靖波)

  7. Tsinghua University: Yang Liu (刘洋)

  8. Institute of Automation, Chinese Academy of Sciences: Chengqing Zong (宗成庆), Jiajun Zhang (张家俊)

  9. Soochow University: Deyi Xiong (熊德意), Min Zhang (张民)

  10. Huawei Noah's Ark Lab: Hang Li (李航), Zhaopeng Tu (涂兆鹏)

  11. Baidu: Haifeng Wang (王海峰), Hua Wu (吴华)


This is an initial version and necessarily limited; suggestions, corrections, and additions are welcome, and the collection will be kept up to date. Visit http://www.zhuanzhi.ai and follow the Zhuanzhi WeChat account for the latest AI knowledge.


Feel free to forward this to your WeChat groups and Moments to share professional AI knowledge!



Special note - the Zhuanzhi Machine Translation topic:

To view the complete version and the latest updates, log in at www.zhuanzhi.ai on PC (or tap "Read the original" to register and log in), search for the topic "机器翻译" (machine translation) in the top search bar, and browse the full collection; the experience is best on PC.

In addition, follow the Zhuanzhi WeChat account (scan the QR code at the bottom, or tap the blue "专知" link above), and then

  • reply "机器翻译" or "MT" in the background to get the link for viewing and downloading the PDF of this Machine Translation collection on mobile.


To get the complete collections from previous issues (follow this account, 专知, for download links), see:

【Zhuanzhi Collection 01】Deep Learning knowledge compendium (introductory/advanced/papers/code/data/surveys/experts) (with PDF download)

【Zhuanzhi Collection 02】Natural Language Processing (NLP) knowledge compendium (introductory/advanced/papers/toolkits/data/surveys/experts) (with PDF download)

【Zhuanzhi Collection 03】Knowledge Graph (KG) knowledge compendium (introductory/advanced/papers/code/data/surveys/experts) (with PDF download)

【Zhuanzhi Collection 04】Question Answering (QA) knowledge compendium (introductory/advanced/papers/code/data/surveys/experts) (with PDF download)

【Zhuanzhi Collection 05】Chatbot knowledge compendium (introductory/advanced/papers/software/data/experts) (with PDF download)

【Zhuanzhi Collection 06】Computer Vision (CV) knowledge compendium (introductory/advanced/papers/courses/conferences/experts) (with PDF download)

【Zhuanzhi Collection 07】Automatic Summarization (AS) knowledge compendium (introductory/advanced/code/data/experts) (with PDF download)

【Zhuanzhi Collection 08】Image Captioning knowledge compendium (introductory/advanced/papers/surveys/videos/experts)

【Zhuanzhi Collection 09】Object Detection knowledge compendium (introductory/advanced/papers/surveys/videos/code)

【Zhuanzhi Collection 10】Recommender Systems (RS) knowledge compendium (introductory/advanced/papers/surveys/videos/code)

【Zhuanzhi Collection 11】Generative Adversarial Networks (GAN) knowledge compendium (theory/reports/tutorials/surveys/code)

【Zhuanzhi Collection 12】Information Retrieval knowledge compendium (introductory/advanced/surveys/code/experts, with PDF download)

【Zhuanzhi Collection 13】User Profile knowledge compendium for industry and academia (introductory/advanced/competitions/papers/PPT, with PDF download)


Scan the assistant's QR code to join the Zhuanzhi AI group for discussion and sharing.

-END-

Welcome to Zhuanzhi

Zhuanzhi: a new way of knowing! Focused on the AI field, it provides professional, trustworthy knowledge-distribution services for AI practitioners, including topic customization, topic pathways, and search and discovery, helping you find the knowledge you need quickly and reliably.


How to use >> Visit www.zhuanzhi.ai, or tap "Read the original" below this article, to access Zhuanzhi.

The Zhuanzhi team, Institute of Automation, Chinese Academy of Sciences

© 2017 专知

专 · 知

Follow our WeChat account for the latest news, techniques, algorithms, and in-depth content about Zhuanzhi and AI. Scan the QR code below to follow us.


Tap "Read the original" to use Zhuanzhi

