Publication · Conference object · Preprint · 2017

Efficient Low Rank Tensor Ring Completion

Wang, Wenqi; Aggarwal, Vaneet; Aeron, Shuchin
Open Access
  • Published: 23 Jul 2017
  • Publisher: IEEE
Using the matrix product state (MPS) representation of the recently proposed tensor ring decomposition, in this paper we propose a tensor completion algorithm: an alternating minimization algorithm that alternates over the factors in the MPS representation. This development is motivated in part by the success of matrix completion algorithms that alternate over the (low-rank) factors. We further propose a spectral initialization for the tensor ring completion algorithm and analyze its computational complexity. We numerically compare it with existing methods that employ a low-rank tensor train approximation for data completion…
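The abstract describes alternating minimization over the cores of a tensor ring (MPS) representation, where each entry of the tensor is the trace of a product of index-selected core slices. The following is a minimal NumPy sketch of that idea, not the paper's exact algorithm (it omits the spectral initialization and uses dense per-entry loops for clarity); all function and variable names here are illustrative:

```python
import numpy as np

def tr_entry(cores, idx):
    """One entry of a tensor-ring tensor: the trace of the product of the
    index-selected slices G_1[:, i_1, :] @ ... @ G_d[:, i_d, :]."""
    M = cores[0][:, idx[0], :]
    for G, i in zip(cores[1:], idx[1:]):
        M = M @ G[:, i, :]
    return np.trace(M)

def tr_full(cores):
    """Densely reconstruct the full tensor from its cores (small sizes only)."""
    shape = tuple(G.shape[1] for G in cores)
    T = np.empty(shape)
    for idx in np.ndindex(shape):
        T[idx] = tr_entry(cores, idx)
    return T

def als_update_core(cores, k, obs_idx, obs_val):
    """One alternating-minimization step: with all other cores fixed,
    re-solve core k by linear least squares over the observed entries.
    Uses trace(G_k[:, i, :] @ A) = vec(G_k[:, i, :]) . vec(A.T),
    where A is the product of the remaining slices in ring order."""
    d = len(cores)
    rk, nk, rk1 = cores[k].shape
    new = cores[k].copy()
    for i in range(nk):                      # one least-squares problem per slice
        rows, vals = [], []
        for idx, v in zip(obs_idx, obs_val):
            if idx[k] != i:
                continue
            A = np.eye(rk1)
            for j in list(range(k + 1, d)) + list(range(k)):
                A = A @ cores[j][:, idx[j], :]
            rows.append(A.T.ravel())
            vals.append(v)
        if rows:
            sol, *_ = np.linalg.lstsq(np.array(rows), np.array(vals), rcond=None)
            new[:, i, :] = sol.reshape(rk, rk1)
    return new

# Small demo: random tensor-ring cores; the ring closes since r_{d+1} = r_1.
rng = np.random.default_rng(0)
ranks = [2, 3, 2, 4]
dims = [4, 5, 6, 3]
cores = [rng.standard_normal((ranks[k], dims[k], ranks[(k + 1) % 4]))
         for k in range(4)]
T = tr_full(cores)
print(T.shape)  # (4, 5, 6, 3)
```

Because each entry is linear in any single core when the others are held fixed, each alternating step is an ordinary least-squares solve, which is what makes the factor-wise alternation tractable.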
ACM Computing Classification System: Mathematics of Computing → Numerical Analysis
free text keywords: Artificial intelligence, Pattern recognition, Algorithm, Matrix product state, Matrix decomposition, Matrix completion, Initialization, Tensor, Computational complexity theory, Algorithm design, Computer science, Approximation algorithm, Computer Science - Learning, Computer Science - Information Theory