H. Hassan, A. Aue, C. Chen, V. Chowdhary, J. Clark, C. Federmann, X. Huang, M. Junczys-Dowmunt, W. Lewis, M. Li, S. Liu, T. Liu, R. Luo, A. Menezes, T. Qin, F. Seide, X. Tan, F. Tian, L. Wu, S. Wu, Y. Xia, D. Zhang, Z. Zhang, and M. Zhou, “Achieving human parity on automatic Chinese to English news translation,” CoRR, vol. abs/1803.05567, 2018. [Online]. Available: http://arxiv.org/abs/1803.05567
 S. Läubli, R. Sennrich, and M. Volk, “Has machine translation achieved human parity? A case for document-level evaluation,” in Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium, 2018, pp. 4791-4796. [Online]. Available: https://aclanthology.info/papers/D18-1512/d18-1512
 K. Murray and D. Chiang, “Correcting length bias in neural machine translation,” in Proceedings of the Third Conference on Machine Translation: Research Papers, 2018, pp. 212-223.
 X. Shi, K. Knight, and D. Yuret, “Why neural translations are the right length,” in Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, 2016, pp. 2278-2282.
 A. M. Rush, S. Chopra, and J. Weston, “A neural attention model for abstractive sentence summarization,” arXiv preprint arXiv:1509.00685, 2015.
 Y. Kikuchi, G. Neubig, R. Sasano, H. Takamura, and M. Okumura, “Controlling output length in neural encoder-decoders,” arXiv preprint arXiv:1609.09552, 2016.
 A. Fan, D. Grangier, and M. Auli, “Controllable abstractive summarization,” arXiv preprint arXiv:1711.05217, 2017.
 Y. Liu, Z. Luo, and K. Zhu, “Controlling length in abstractive summarization using a convolutional neural network,” in Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018, pp. 4110-4119.
 S. Takase and N. Okazaki, “Positional encoding to control output sequence length,” in Proceedings of NAACL-HLT 2019, 2019. [Online]. Available: http://arxiv.org/abs/1904.07418
 A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, Ł. Kaiser, and I. Polosukhin, “Attention is all you need,” in Advances in Neural Information Processing Systems, 2017, pp. 6000-6010.
 M.-T. Luong, H. Pham, and C. D. Manning, “Effective approaches to attention-based neural machine translation,” arXiv preprint arXiv:1508.04025, 2015.
 C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, D. Erhan, V. Vanhoucke, and A. Rabinovich, “Going deeper with convolutions,” in Proceedings of the IEEE conference on computer vision and pattern recognition, 2015, pp. 1-9.
 M. Johnson, M. Schuster, Q. V. Le, M. Krikun, Y. Wu, Z. Chen, N. Thorat, F. Viégas, M. Wattenberg, G. Corrado et al., “Google's multilingual neural machine translation system: Enabling zero-shot translation,” arXiv preprint arXiv:1611.04558, 2016.
 T.-L. Ha, J. Niehues, and A. Waibel, “Toward multilingual neural machine translation with universal encoder and decoder,” arXiv preprint arXiv:1611.04798, 2016.
 R. Sennrich, B. Haddow, and A. Birch, “Neural machine translation of rare words with subword units,” arXiv preprint arXiv:1508.07909, 2015.