BLEU: a Method for Automatic Evaluation of Machine Translation. Kishore Papineni, Salim Roukos, Todd Ward, and Wei-Jing Zhu. IBM T. J. Watson Research Center.
Published October 2002. DOI: 10.3115/1073083.1073135.
From the abstract: Human evaluations of machine translation are extensive but expensive.
Published in ACL-2002: Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics, pp. 311–318.
The central idea: the closer a machine translation is to a professional human translation, the better it is.
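That idea is operationalized as modified (clipped) n-gram precision against one or more reference translations, combined with a brevity penalty for overly short candidates. A minimal Python sketch of that computation (function names and the uniform 1/N weighting are my simplification, not the paper's exact formulation):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, references, max_n=4):
    """Sentence-level BLEU sketch: clipped n-gram precision + brevity penalty.

    candidate: list of tokens; references: list of token lists.
    """
    if not candidate:
        return 0.0
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        # Clip each n-gram count by its maximum count in any single reference.
        max_ref = Counter()
        for ref in references:
            for gram, count in Counter(ngrams(ref, n)).items():
                max_ref[gram] = max(max_ref[gram], count)
        clipped = sum(min(count, max_ref[gram]) for gram, count in cand_counts.items())
        total = max(1, sum(cand_counts.values()))
        precisions.append(clipped / total)
    if min(precisions) == 0:
        return 0.0  # geometric mean is zero if any precision is zero
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty against the reference length closest to the candidate's.
    c = len(candidate)
    r = min((abs(len(ref) - c), len(ref)) for ref in references)[1]
    bp = 1.0 if c > r else math.exp(1 - r / c)
    return bp * math.exp(log_avg)
```

A candidate identical to a reference scores 1.0; repeating reference words does not inflate the score, because each n-gram's credit is clipped at its reference count.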
@inproceedings{papineni-etal-2002-bleu,
  title = {{BLEU}: a Method for Automatic Evaluation of Machine Translation},
  author = {Papineni, Kishore and Roukos, Salim and Ward, Todd and Zhu, Wei-Jing},
  booktitle = {Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics},
  year = {2002},
  pages = {311--318}
}