
ZOOpt: Toolbox for Derivative-Free Optimization

Liu, Yu-Ren; Hu, Yi-Qi; Qian, Hong; Yu, Yang; Qian, Chao
Open Access • English
Published: 31 Dec 2017
Abstract
Recent advances in derivative-free optimization allow efficient approximation of the global optima of sophisticated functions, such as functions with many local optima and non-differentiable or non-continuous functions. This article describes the ZOOpt (https://github.com/eyounx/ZOOpt) toolbox, which provides efficient derivative-free solvers designed to be easy to use. ZOOpt provides a Python package for single-thread optimization and a lightweight distributed version, built with the help of the Julia language, for objective functions described in Python. The ZOOpt toolbox particularly focuses on optimization problems in machine learning, addressing high-dimensional, noisy, a...
Subjects
free text keywords: Computer Science - Learning, Statistics - Machine Learning
