Reproducing and learning new algebraic operations on word embeddings using genetic programming

Preprint (English, Open Access)
Santana, Roberto (2017)
  • Subject: Computer Science - Computation and Language (Computational Linguistics and Natural Language and Speech Processing)

Word-vector representations associate a high-dimensional real vector with every word in a corpus. Recently, neural-network-based methods have been proposed for learning this representation from large corpora. This type of word-to-vector embedding is able to keep, in the...
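The abstract refers to embeddings that preserve semantic relations as algebraic operations on vectors, the best-known being the analogy pattern v(king) - v(man) + v(woman) ≈ v(queen). A minimal sketch of that arithmetic, using hypothetical toy vectors (real embeddings are learned from large corpora and have hundreds of dimensions):

```python
import numpy as np

# Toy 4-dimensional embeddings; the values are hypothetical, chosen only
# to illustrate the vector-offset analogy, not taken from the paper.
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "man":   np.array([0.1, 0.9, 0.1, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9, 0.1]),
    "apple": np.array([0.05, 0.05, 0.05, 0.9]),  # unrelated distractor
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c, vocab):
    """Return the word whose vector is closest to v(a) - v(b) + v(c),
    excluding the three query words themselves."""
    target = vocab[a] - vocab[b] + vocab[c]
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(target, candidates[w]))

print(analogy("king", "man", "woman", emb))  # -> queen
```

The paper's contribution, as the abstract indicates, is to search over such algebraic compositions automatically with genetic programming rather than fixing the addition/subtraction pattern by hand.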
  • References (31)

    [1] Y. Bengio, R. Ducharme, P. Vincent, and C. Jauvin. A neural probabilistic language model. Journal of machine learning research, 3(Feb):1137-1155, 2003.

    [2] U. Bhowan, M. Johnston, M. Zhang, and X. Yao. Evolving diverse ensembles using genetic programming for classification with unbalanced data. IEEE Transactions on Evolutionary Computation, 17(3):368-386, 2013.

    [3] W. Blacoe and M. Lapata. A comparison of vector-based representations for semantic composition. In Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, pages 546-556. Association for Computational Linguistics, 2012.

    [4] P. A. Bosman and E. D. De Jong. Learning probabilistic tree grammars for genetic programming. In International Conference on Parallel Problem Solving from Nature, pages 192-201. Springer, 2004.

    [6] R. Cummins and C. O'Riordan. An analysis of the solution space for genetically programmed term-weighting schemes in information retrieval. In P. S. P. M. D. Bell, editor, 17th Artificial Intelligence and Cognitive Science Conference (AICS 2006), Queen's University, Belfast, 2006.

    [7] H. J. Escalante, M. A. García-Limón, A. Morales-Reyes, M. Graff, M. Montes-y-Gómez, E. F. Morales, and J. Martínez-Carranza. Term-weighting learning via genetic programming for text classification. Knowledge-Based Systems, 83:176-189, 2015.

    [8] F.-A. Fortin, F.-M. De Rainville, M.-A. Gardner, M. Parizeau, and C. Gagné. DEAP: Evolutionary algorithms made easy. The Journal of Machine Learning Research, 13(1):2171-2175, 2012.

    [9] E. Grefenstette and M. Sadrzadeh. Experimental support for a categorical compositional distributional model of meaning. In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 1394-1404. Association for Computational Linguistics, 2011.

    [10] M. Iqbal, W. Browne, and M. Zhang. Reusing building blocks of extracted knowledge to solve complex, large-scale Boolean problems. IEEE Transactions on Evolutionary Computation, 18(4):465-480, Aug 2014.

    [11] M. Iyyer, J. L. Boyd-Graber, L. M. B. Claudino, R. Socher, and H. Daumé III. A neural network for factoid question answering over paragraphs. In Empirical Methods in Natural Language Processing (EMNLP), pages 633-644, 2014.
