Simplifying Neural Machine Translation with Addition-Subtraction Twin-Gated Recurrent Networks
Zhang, Biao; Xiong, Deyi; Su, Jinsong; Lin, Qian; Zhang, Huiji
Subject: Computer Science - Computation and Language
In this paper, we propose an addition-subtraction twin-gated recurrent network (ATR) to simplify neural machine translation. The recurrent units of ATR are heavily simplified to have the smallest number of weight matrices among units of all existing gated RNNs.
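Based on the abstract's description, an ATR unit keeps only two weight matrices (one for the input, one for the previous hidden state) and derives its two gates from the same pair of projections by addition and subtraction. The sketch below illustrates this idea; the exact gate equations and the function name `atr_step` are assumptions for illustration, not taken from the abstract itself.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def atr_step(x_t, h_prev, W, U):
    """One recurrent step of a twin-gated unit (illustrative sketch).

    W, U are the unit's only two weight matrices.
    """
    p = W @ x_t      # input projection
    q = U @ h_prev   # history projection
    # Twin gates: both are computed from the same projections,
    # differing only in the sign of q.
    i = sigmoid(p + q)   # input gate
    f = sigmoid(p - q)   # forget gate
    # Gated blend of new input and carried-over state.
    return i * p + f * h_prev

# Minimal usage example with random weights.
rng = np.random.default_rng(0)
d = 4
W = rng.normal(size=(d, d))
U = rng.normal(size=(d, d))
h = np.zeros(d)
for x_t in rng.normal(size=(3, d)):
    h = atr_step(x_t, h, W, U)
```

Because the gates share their projections, the unit needs no separate gate matrices, which is what makes its parameter count smaller than that of LSTM or GRU cells.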
M. Antonino and M. Federico. 2018. Deep Neural Machine Translation with Weakly-Recurrent Units. ArXiv e-prints.
Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio. 2015. Neural machine translation by jointly learning to align and translate. In Proc. of ICLR.
Samuel R. Bowman, Gabor Angeli, Christopher Potts, and Christopher D. Manning. 2015. A large annotated corpus for learning natural language inference. In Proc. of EMNLP. Association for Computational Linguistics.
Samuel R. Bowman, Jon Gauthier, Abhinav Rastogi, Raghav Gupta, Christopher D. Manning, and Christopher Potts. 2016. A fast unified model for parsing and sentence understanding. In Proc. of ACL, pages 1466-1477.
James Bradbury, Stephen Merity, Caiming Xiong, and Richard Socher. 2016. Quasi-recurrent neural networks. CoRR, abs/1611.01576.
Christian Buck, Kenneth Heafield, and Bas van Ooyen. 2014. N-gram counts and language models from the common crawl. In Proc. of LREC, pages 3579-3584, Reykjavik, Iceland.
Xinchi Chen, Xipeng Qiu, Chenxi Zhu, Pengfei Liu, and Xuanjing Huang. 2015. Long short-term memory neural networks for Chinese word segmentation. In Proc. of EMNLP, pages 1197-1206.
Jianpeng Cheng, Li Dong, and Mirella Lapata. 2016. Long short-term memory-networks for machine reading. In Proc. of EMNLP, pages 551-561.
Junyoung Chung, Çağlar Gülçehre, KyungHyun Cho, and Yoshua Bengio. 2014. Empirical evaluation of gated recurrent neural networks on sequence modeling. CoRR.
Jeffrey L. Elman. 1990. Finding structure in time. Cognitive Science, 14(2):179-211.