Publication · Preprint · 2015

A New Sampling Technique for Tensors

Bhojanapalli, Srinadh; Sanghavi, Sujay;
Open Access English
  • Published: 17 Feb 2015
In this paper we propose new techniques to sample arbitrary third-order tensors, with the objective of speeding up tensor algorithms that have recently gained popularity in machine learning. Our main contribution is a new way to select, in a biased random way, only $O(n^{1.5}/\epsilon^2)$ of the possible $n^3$ elements while still achieving each of three goals: (a) tensor sparsification: for a tensor that has to be formed from arbitrary samples, compute very few elements to get a good spectral approximation, and, for arbitrary orthogonal tensors, (b) tensor completion: recover an exactly low-rank tensor from a small number of samples via alternat...
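To make the idea of biased element sampling concrete, here is a minimal sketch in NumPy. It is not the paper's exact distribution (which is not specified in this abstract); it uses one common biased scheme, sampling element $(i,j,k)$ with probability proportional to $|T_{ijk}|^2$ and rescaling kept entries so the sparse tensor is an unbiased estimate of the original. The function name and the choice of $O(n^{1.5})$ samples in the example are illustrative assumptions.

```python
import numpy as np

def biased_sample_tensor(T, num_samples, rng=None):
    """Sparsify a third-order tensor by biased element sampling.

    Illustrative scheme (not necessarily the paper's distribution):
    sample indices with probability proportional to T_ijk^2, then
    rescale kept entries so E[S] = T (Horvitz-Thompson weighting).
    """
    rng = np.random.default_rng() if rng is None else rng
    probs = (T ** 2).ravel()
    probs = probs / probs.sum()
    # Draw num_samples indices with replacement, biased by magnitude.
    idx = rng.choice(T.size, size=num_samples, replace=True, p=probs)
    counts = np.bincount(idx, minlength=T.size).astype(float)
    S = np.zeros_like(T, dtype=float)
    keep = counts > 0  # counts > 0 implies probs > 0, so division is safe
    S.ravel()[keep] = T.ravel()[keep] * counts[keep] / (num_samples * probs[keep])
    return S

# Example: sparsify a rank-1 tensor of n^3 entries using ~4 n^{1.5} samples.
n = 20
u = np.random.randn(n)
T = np.einsum('i,j,k->ijk', u, u, u)
m = int(4 * n ** 1.5)
S = biased_sample_tensor(T, m)
```

The rescaling step is what keeps the sparse tensor an unbiased estimator: each kept entry is divided by its expected sampling count, so averaging over the randomness recovers the original tensor.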
free text keywords: Statistics - Machine Learning, Computer Science - Data Structures and Algorithms, Computer Science - Information Theory, Computer Science - Learning
39 references, page 1 of 3

Acar, Evrim, Aykut-Bingol, Canan, Bingol, Haluk, Bro, Rasmus, and Yener, Bülent. Multiway analysis of epilepsy tensors. Bioinformatics, 23(13):i10-i18, 2007.

Acar, Evrim, Dunlavy, Daniel M, Kolda, Tamara G, and Mørup, Morten. Scalable tensor factorizations for incomplete data. Chemometrics and Intelligent Laboratory Systems, 106(1):41-56, 2011.

Achlioptas, Dimitris and McSherry, Frank. Fast computation of low rank matrix approximations. In Proceedings of the thirty-third annual ACM symposium on Theory of computing, pp. 611-618. ACM, 2001.

Achlioptas, Dimitris, Karnin, Zohar, and Liberty, Edo. Near-optimal distributions for data matrix sampling. Advances in Neural Information Processing Systems, 73, 2013.

Anandkumar, Anima, Foster, Dean P, Hsu, Daniel, Kakade, Sham, and Liu, Yi-Kai. A spectral algorithm for latent Dirichlet allocation. In NIPS, pp. 926-934, 2012.

Anandkumar, Animashree, Ge, Rong, Hsu, Daniel, Kakade, Sham M, and Telgarsky, Matus. Tensor decompositions for learning latent variable models. The Journal of Machine Learning Research, 15(1):2773-2832, 2014a.

Anandkumar, Animashree, Ge, Rong, and Janzamin, Majid. Guaranteed non-orthogonal tensor decomposition via alternating rank-1 updates. arXiv preprint arXiv:1402.5180, 2014b.

Barak, Boaz and Moitra, Ankur. Tensor prediction, Rademacher complexity and random 3-XOR. arXiv preprint arXiv:1501.06521, 2015.

Bhojanapalli, Srinadh, Jain, Prateek, and Sanghavi, Sujay. Tighter low-rank approximation via sampling the leveraged element. In Proceedings of the Twenty-Sixth Annual ACM-SIAM Symposium on Discrete Algorithms, pp. 902-920, 2015. doi: 10.1137/1.9781611973730.62.

Candès, Emmanuel J and Recht, Benjamin. Exact matrix completion via convex optimization. Foundations of Computational Mathematics, 9(6):717-772, 2009.

Candès, Emmanuel J and Tao, Terence. The power of convex relaxation: Near-optimal matrix completion. Information Theory, IEEE Transactions on, 56(5):2053-2080, 2010.

Carroll, J Douglas and Chang, Jih-Jie. Analysis of individual differences in multidimensional scaling via an n-way generalization of Eckart-Young decomposition. Psychometrika, 35(3):283-319, 1970.

Chen, Yudong, Bhojanapalli, Srinadh, Sanghavi, Sujay, and Ward, Rachel. Coherent matrix completion. In Proceedings of The 31st International Conference on Machine Learning, pp. 674-682, 2014.

Comon, Pierre. Tensor decompositions, state of the art and applications. arXiv preprint arXiv:0905.0454, 2009.

De Lathauwer, Lieven, De Moor, Bart, and Vandewalle, Joos. On the best rank-1 and rank-(R1, R2, ..., RN) approximation of higher-order tensors. SIAM Journal on Matrix Analysis and Applications, 21(4):1324-1342, 2000.
