Publication · Preprint · 2016

A Tour of TensorFlow

Peter Goldsborough
Open Access English
  • Published: 01 Oct 2016
Abstract
Deep learning is a branch of artificial intelligence employing deep neural network architectures that has significantly advanced the state of the art in computer vision, speech recognition, natural language processing and other domains. In November 2015, Google released TensorFlow, an open-source deep learning software library for defining, training and deploying machine learning models. In this paper, we review TensorFlow and place it in the context of modern deep learning concepts and software. We discuss its basic computational paradigms and distributed execution model, its programming interface, and its accompanying visualization toolkits. We then com...
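The computational paradigm the abstract refers to, a dataflow graph that is first constructed and only later executed, can be sketched in a few lines of plain Python. This is a toy illustration with invented names, not TensorFlow's actual API; it only mirrors the idea that graph construction and graph execution are separate phases (compare `Session.run` in TensorFlow's original interface):

```python
# Toy sketch of a deferred-execution dataflow graph (illustrative only,
# no TensorFlow dependency). Operation nodes are declared first; values
# flow through the graph only when it is explicitly run.

class Node:
    def __init__(self, op, inputs=()):
        self.op = op          # function producing this node's value
        self.inputs = inputs  # upstream nodes feeding this operation

def constant(value):
    # Leaf node: always yields a fixed value.
    return Node(lambda: value)

def add(a, b):
    return Node(lambda x, y: x + y, (a, b))

def mul(a, b):
    return Node(lambda x, y: x * y, (a, b))

def run(node):
    # Evaluate upstream nodes recursively, then apply this node's op.
    return node.op(*(run(n) for n in node.inputs))

# Graph construction is separate from execution:
x = constant(3.0)
y = constant(4.0)
z = add(x, mul(x, y))  # represents x + x*y; nothing is computed yet
print(run(z))          # evaluation happens only here -> 15.0
```

The separation lets a runtime inspect the whole graph before executing it, which is what enables TensorFlow's distributed execution and visualization tooling discussed in the paper.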
Subjects
free text keywords: Computer Science - Learning
32 references, page 1 of 3

[1] Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature, vol. 521, no. 7553, pp. 436-444, May 2015. [Online]. Available: http://dx.doi.org/10.1038/nature14539

[2] V. Nair and G. E. Hinton, “Rectified linear units improve restricted Boltzmann machines,” in Proceedings of the 27th International Conference on Machine Learning (ICML-10), J. Fürnkranz and T. Joachims, Eds. Omnipress, 2010, pp. 807-814. [Online]. Available: http://www.icml2010.org/papers/432.pdf

[3] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, “Dropout: A simple way to prevent neural networks from overfitting,” Journal of Machine Learning Research, vol. 15, pp. 1929-1958, 2014. [Online]. Available: http://jmlr.org/papers/v15/srivastava14a.html

[4] L. Rampasek and A. Goldenberg, “TensorFlow: Biology's gateway to deep learning?” Cell Systems, vol. 2, no. 1, pp. 12-14, 2016. [Online]. Available: http://dx.doi.org/10.1016/j.cels.2016.01.009

[6] R. Collobert, S. Bengio, and J. Mariéthoz, “Torch: A modular machine learning software library,” 2002.

[7] F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, and E. Duchesnay, “Scikit-learn: Machine learning in Python,” J. Mach. Learn. Res., vol. 12, pp. 2825-2830, Nov. 2011. [Online]. Available: http://dl.acm.org/citation.cfm?id=1953048.2078195

[8] M. Abadi, A. Agarwal, P. Barham, E. Brevdo, Z. Chen, C. Citro, G. S. Corrado, A. Davis, J. Dean, M. Devin, S. Ghemawat, I. Goodfellow, A. Harp, G. Irving, M. Isard, Y. Jia, R. Jozefowicz, L. Kaiser, M. Kudlur, J. Levenberg, D. Mané, R. Monga, S. Moore, D. Murray, C. Olah, M. Schuster, J. Shlens, B. Steiner, I. Sutskever, K. Talwar, P. Tucker, V. Vanhoucke, V. Vasudevan, F. Viégas, O. Vinyals, P. Warden, M. Wattenberg, M. Wicke, Y. Yu, and X. Zheng, “TensorFlow: Large-scale machine learning on heterogeneous systems,” 2015, software available from tensorflow.org. [Online]. Available: http://tensorflow.org/

[9] R. Kohavi, D. Sommerfield, and J. Dougherty, “Data mining using MLC++: A machine learning library in C++,” in Tools with Artificial Intelligence, Proceedings Eighth IEEE International Conference on, Nov 1996, pp. 234-245.

[10] G. Bradski, “The OpenCV library,” Dr. Dobb's Journal, vol. 25, no. 11, pp. 120-126, 2000.

[11] C. R. de Souza, “A tutorial on principal component analysis with the Accord.NET Framework,” CoRR, vol. abs/1210.7463, 2012. [Online]. Available: http://arxiv.org/abs/1210.7463

[12] A. Bifet, G. Holmes, B. Pfahringer, P. Kranen, H. Kremer, T. Jansen, and T. Seidl, “MOA: Massive online analysis, a framework for stream classification and clustering,” in Journal of Machine Learning Research (JMLR) Workshop and Conference Proceedings, Volume 11: Workshop on Applications of Pattern Analysis. Journal of Machine Learning Research, 2010, pp. 44-50.

[13] M. Zaharia, M. Chowdhury, M. J. Franklin, S. Shenker, and I. Stoica, “Spark: Cluster computing with working sets,” in Proceedings of the 2nd USENIX Conference on Hot Topics in Cloud Computing, ser. HotCloud'10. Berkeley, CA, USA: USENIX Association, 2010, pp. 10-10. [Online]. Available: http://dl.acm.org/citation.cfm?id=1863103.1863113

[14] X. Meng, J. K. Bradley, B. Yavuz, E. R. Sparks, S. Venkataraman, D. Liu, J. Freeman, D. B. Tsai, M. Amde, S. Owen, D. Xin, R. Xin, M. J. Franklin, R. Zadeh, M. Zaharia, and A. Talwalkar, “MLlib: Machine learning in Apache Spark,” CoRR, vol. abs/1505.06807, 2015. [Online]. Available: http://arxiv.org/abs/1505.06807

[15] Y. Jia, E. Shelhamer, J. Donahue, S. Karayev, J. Long, R. B. Girshick, S. Guadarrama, and T. Darrell, “Caffe: Convolutional architecture for fast feature embedding,” CoRR, vol. abs/1408.5093, 2014. [Online]. Available: http://arxiv.org/abs/1408.5093

[16] D. Murray, “Announcing TensorFlow 0.8 – now with distributed computing support!” Google Research Blog, April 2016 (accessed May 22, 2016), http://googleresearch.blogspot.de/2016/04/announcingtensorflow-08-now-with.html.
