Publication · Conference object · Preprint · 2018

Combining Advanced Methods in Japanese-Vietnamese Neural Machine Translation

Thi-Vinh Ngo; Thanh-Le Ha; Phuong-Thai Nguyen; Le-Minh Nguyen
Open Access
  • Published: 14 Dec 2018
  • Publisher: IEEE
Abstract
Neural machine translation (NMT) systems have recently achieved state-of-the-art results on many machine translation tasks between popular language pairs, thanks to the availability of large amounts of data. For low-resource language pairs, little research exists in this field due to the lack of bilingual data. In this paper, we attempt to build the first NMT systems for a low-resource language pair: Japanese-Vietnamese. We also show significant improvements when combining advanced methods to reduce the adverse impact of data sparsity and improve the quality of NMT systems. In addition, we propose a variant of the Byte-Pair Encoding algorithm that performs effective word segmentation for Vietnamese texts and alleviates the rare-word problem that persists in NMT systems.
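The abstract credits a variant of Byte-Pair Encoding (BPE) for Vietnamese word segmentation, but the variant itself is not described on this page. As background only, here is a minimal sketch of the standard BPE merge-learning loop of Sennrich et al. (2016); the function and variable names are illustrative, not taken from the paper.

```python
from collections import Counter

def bpe_merges(corpus, num_merges):
    """Learn BPE merge operations from a list of words (a hedged sketch
    of standard BPE, not the paper's Vietnamese-specific variant)."""
    # Represent each word as a symbol sequence with an end-of-word marker.
    vocab = Counter(tuple(word) + ("</w>",) for word in corpus)
    merges = []
    for _ in range(num_merges):
        # Count all adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Rewrite every word, replacing the best pair with its merged symbol.
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] += freq
        vocab = new_vocab
    return merges

merges = bpe_merges(["low", "lower", "lowest", "low"], num_merges=3)
# → [('l', 'o'), ('lo', 'w'), ('low', '</w>')]
```

Frequent character sequences (here "low") become single subword units, so rare words decompose into known subwords instead of mapping to an unknown token; the paper's contribution is adapting this idea to Vietnamese, where word boundaries are not marked by white space alone.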
Subjects: Computer Science - Computation and Language; Vietnamese; Natural language processing; Knowledge engineering; Artificial intelligence; Text segmentation; Machine translation; Decoding methods; Encoding