Publication · Conference object · Preprint · 2018

Combining Advanced Methods in Japanese-Vietnamese Neural Machine Translation

Thi-Vinh Ngo; Thanh-Le Ha; Phuong-Thai Nguyen; Le-Minh Nguyen
Open Access  
Published: 14 Dec 2018
Publisher: IEEE
Abstract
Neural machine translation (NMT) systems have recently achieved state-of-the-art results on many machine translation tasks between popular language pairs because of the availability of large amounts of data. For low-resourced language pairs, there has been little research in this field due to the lack of bilingual data. In this paper, we attempt to build the first NMT systems for a low-resourced language pair: Japanese-Vietnamese. We also show significant improvements when combining advanced methods to reduce the adverse impact of data sparsity and improve the quality of the NMT systems. In addition, we propose a variant of the Byte-Pair Encoding algorithm to perform effective word segmentation for Vietnamese texts and alleviate the rare-word problem that persists in NMT systems.
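The abstract's subword segmentation idea builds on standard Byte-Pair Encoding. The sketch below shows only the generic BPE merge-learning loop (in the style of Sennrich et al., 2016), not the paper's Vietnamese-specific variant; the toy vocabulary and the number of merges are illustrative assumptions.

```python
# Minimal sketch of standard BPE merge learning (not the paper's Vietnamese variant).
import re
from collections import Counter

def get_pair_stats(vocab):
    """Count how often each adjacent symbol pair occurs, weighted by word frequency."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[(symbols[i], symbols[i + 1])] += freq
    return pairs

def merge_pair(pair, vocab):
    """Replace every occurrence of the given symbol pair with its merged symbol."""
    bigram = re.escape(" ".join(pair))
    pattern = re.compile(r"(?<!\S)" + bigram + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

# Toy corpus: each word is a space-separated symbol sequence with an end-of-word marker.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2, "n e w e s t </w>": 6, "w i d e s t </w>": 3}
num_merges = 10  # illustrative; real systems learn tens of thousands of merges
for _ in range(num_merges):
    stats = get_pair_stats(vocab)
    if not stats:
        break
    best = max(stats, key=stats.get)  # most frequent adjacent pair
    vocab = merge_pair(best, vocab)
    print(best)
```

Each learned merge turns frequent character sequences into reusable subword units, which is what allows rare and unseen words to be decomposed at translation time.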
Subjects by Vocabulary

Microsoft Academic Graph classification: Machine translation; Natural language processing; Artificial intelligence; Computer science; Text segmentation; Encoding (memory); Decoding methods; Vietnamese language; White spaces; Knowledge engineering

Subjects

Computer Science - Computation and Language
