Cross-Lingual Dependency Parsing with Late Decoding for Truly Low-Resource Languages

Preprint · English · Open Access
Schlichtkrull, Michael Sejr; Søgaard, Anders
  • Subject: Computer Science - Computation and Language

In cross-lingual dependency annotation projection, information is often lost during transfer because of early decoding. We present an end-to-end graph-based neural network dependency parser that can be trained to reproduce matrices of edge scores, which can be directly ...
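The abstract's "late decoding" refers to keeping the full matrix of edge scores and only committing to a single tree at the very end. As a rough illustration of that final step, the sketch below exhaustively decodes the highest-scoring dependency tree from a score matrix; real parsers use an efficient maximum spanning arborescence algorithm such as Chu-Liu/Edmonds, and the matrix values here are made up for the example. This is not the authors' code, only a toy rendering of the decoding idea.

```python
import itertools

def decode_tree(scores):
    """Exhaustively decode the highest-scoring dependency tree from an
    edge score matrix.

    scores[h][d] is the score of an arc from head h to dependent d,
    where index 0 is the artificial ROOT and 1..n are the words.
    Brute force is only feasible for toy sentences; in practice this
    step is done with Chu-Liu/Edmonds.
    """
    n = len(scores) - 1  # number of words
    best_heads, best_score = None, float("-inf")
    # Each word picks any head (ROOT or another word, never itself).
    for heads in itertools.product(range(n + 1), repeat=n):
        if any(h == d + 1 for d, h in enumerate(heads)):
            continue  # skip self-loops
        if not _is_tree(heads):
            continue  # skip cycles and words unreachable from ROOT
        score = sum(scores[h][d + 1] for d, h in enumerate(heads))
        if score > best_score:
            best_heads, best_score = list(heads), score
    return best_heads, best_score

def _is_tree(heads):
    """Check that every word reaches ROOT (index 0) without a cycle."""
    for d in range(len(heads)):
        seen, node = set(), d + 1
        while node != 0:
            if node in seen:
                return False
            seen.add(node)
            node = heads[node - 1]
    return True

# Toy 3-word sentence: scores[h][d], row 0 is ROOT.
scores = [
    [0, 10, 0, 0],
    [0, 0, 8, 1],
    [0, 2, 0, 7],
    [0, 1, 1, 0],
]
heads, score = decode_tree(scores)  # chain ROOT -> 1 -> 2 -> 3
```

Because the whole score matrix survives until this step, projected cross-lingual evidence that would be discarded by early, per-arc decisions can still influence the final tree.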
