A Cost-based Optimizer for Gradient Descent Optimization

Preprint, English, Open Access
Kaoudi, Zoi; Quiané-Ruiz, Jorge-Arnulfo; Thirumuruganathan, Saravanan; Chawla, Sanjay; Agrawal, Divy
(2017)

As the use of machine learning (ML) permeates into diverse application domains, there is an urgent need to support a declarative framework for ML. Ideally, a user will specify an ML task in a high-level and easy-to-use language and the framework will invoke the appropriate […]
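To make the setting concrete: the paper's title and abstract frame the problem as choosing among gradient descent "plans" (such as batch, stochastic, and mini-batch gradient descent) on behalf of the user. The sketch below is purely illustrative and is not the authors' system; the function names, defaults, and toy least-squares setup are assumptions made for this example. It only shows the kind of standard gradient descent variants a declarative framework could select among based on estimated cost.

    # Illustrative sketch only (not the paper's implementation): three standard
    # gradient descent variants for least-squares regression. All names and
    # defaults here are assumptions chosen for this example.
    import numpy as np

    def batch_gd(X, y, lr=0.1, epochs=100):
        """Full-batch gradient descent: one update per pass over all data."""
        w = np.zeros(X.shape[1])
        n = len(y)
        for _ in range(epochs):
            grad = X.T @ (X @ w - y) / n          # exact gradient over the whole dataset
            w -= lr * grad
        return w

    def stochastic_gd(X, y, lr=0.01, epochs=100, seed=0):
        """Stochastic gradient descent: one update per training example."""
        rng = np.random.default_rng(seed)
        w = np.zeros(X.shape[1])
        for _ in range(epochs):
            for i in rng.permutation(len(y)):
                grad = X[i] * (X[i] @ w - y[i])   # noisy single-example gradient
                w -= lr * grad
        return w

    def minibatch_gd(X, y, lr=0.01, epochs=100, batch=32, seed=0):
        """Mini-batch gradient descent: a compromise between the two above."""
        rng = np.random.default_rng(seed)
        w = np.zeros(X.shape[1])
        n = len(y)
        for _ in range(epochs):
            idx = rng.permutation(n)
            for start in range(0, n, batch):
                b = idx[start:start + batch]
                grad = X[b].T @ (X[b] @ w - y[b]) / len(b)
                w -= lr * grad
        return w

    if __name__ == "__main__":
        rng = np.random.default_rng(42)
        X = rng.normal(size=(1000, 5))
        true_w = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
        y = X @ true_w + 0.1 * rng.normal(size=1000)
        for name, fn in [("batch", batch_gd), ("sgd", stochastic_gd), ("mini-batch", minibatch_gd)]:
            w = fn(X, y)
            print(f"{name:10s} mean squared error = {np.mean((X @ w - y) ** 2):.4f}")

Running the script prints a comparable squared-error loss for each variant; a cost-based optimizer in the spirit of the paper would additionally weigh factors such as dataset size and per-iteration cost when picking a plan.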
