Prior knowledge integration for neural machine translation using posterior regularization

Authors: Zhang, Jiacheng; Liu, Yang; Luan, Huanbo; Xu, Jingfang; Sun, Maosong

Affiliations: State Key Laboratory of Intelligent Technology and Systems, Tsinghua National Laboratory for Information Science and Technology, Department of Computer Science and Technology, Tsinghua University, Beijing; Jiangsu Collaborative Innovation Center for Language Competence, Jiangsu; Sogou Inc., Beijing

Publication: arXiv

Year: 2018

Subject: Neural machine translation

Abstract: Although neural machine translation has made significant progress recently, how to integrate multiple overlapping, arbitrary prior knowledge sources remains a challenge. In this work, we propose to use posterior regularization to provide a general framework for integrating prior knowledge into neural machine translation. We represent prior knowledge sources as features in a log-linear model, which guides the learning process of the neural translation model. Experiments on Chinese-English translation show that our approach leads to significant improvements. Copyright © 2018, The Authors. All rights reserved.
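
As a rough illustration of the framework sketched in the abstract (not necessarily the paper's exact formulation): the prior knowledge sources are encoded as features \phi(x, y) of a log-linear distribution Q, and training couples the neural model P to Q through a KL penalty alongside the usual likelihood term. The feature function \phi, the weights \gamma, \lambda_1, \lambda_2, and the direction of the KL term below are illustrative assumptions.

    Q(y \mid x; \gamma) \propto \exp\big( \gamma \cdot \phi(x, y) \big)

    J(\theta, \gamma) = \lambda_1 \sum_{(x, y)} \log P(y \mid x; \theta) \;-\; \lambda_2 \sum_{x} \mathrm{KL}\big( Q(\cdot \mid x; \gamma) \,\big\|\, P(\cdot \mid x; \theta) \big)

Maximizing J(\theta, \gamma) pulls the neural model both toward the training data and toward translations favored by the prior-knowledge features.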
