publication . Article . Preprint . Other literature type . 2017

Fast multi-output relevance vector regression

Youngmin Ha; Hai Zhang;
Open Access
  • Published: 17 Apr 2017 Journal: Economic Modelling, volume 81, pages 217-230 (ISSN: 0264-9993)
  • Publisher: Elsevier BV
  • Country: Germany
Abstract
This paper applies the matrix Gaussian distribution as the likelihood function of the complete data set to reduce the time complexity of multi-output relevance vector regression from O(VM^3) to O(V^3 + M^3), where V and M are the numbers of output dimensions and basis functions, respectively, and V < M. Our experimental results demonstrate that the proposed method is faster than, and competitive with, existing methods such as Thayananthan et al. (2008). Its computational efficiency and accuracy can be attributed to a different specification of the likelihood of the data: the existing method expresses the likelihood of the training data as the product of G...
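The complexity reduction described in the abstract rests on the Kronecker structure of a matrix Gaussian covariance: inverting (or factorizing) a V x V row covariance and an M x M column covariance separately costs O(V^3 + M^3), instead of performing V separate O(M^3) operations. A minimal numpy sketch of the underlying identity (Omega ⊗ K)^{-1} = Omega^{-1} ⊗ K^{-1} follows; it is illustrative only, not the paper's actual algorithm, and the names `Omega` and `K` are this sketch's assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
V, M = 3, 50  # number of output dimensions, number of basis functions (V < M)

# Build two well-conditioned SPD covariance factors.
K = rng.standard_normal((M, M))
K = K @ K.T + M * np.eye(M)              # M x M column covariance
Omega = rng.standard_normal((V, V))
Omega = Omega @ Omega.T + V * np.eye(V)  # V x V row covariance

# Per-output approach: one M x M inversion per output dimension,
# i.e. V inversions -> O(V * M^3) overall.
per_output = [np.linalg.inv(K) for _ in range(V)]

# Matrix-Gaussian / Kronecker approach: invert each factor once,
# O(V^3 + M^3), then combine via the Kronecker identity.
K_inv = np.linalg.inv(K)
Omega_inv = np.linalg.inv(Omega)
full_inv = np.kron(Omega_inv, K_inv)

# Sanity check against inverting the full VM x VM covariance directly
# (done here only for verification; it costs O((VM)^3)).
assert np.allclose(full_inv, np.linalg.inv(np.kron(Omega, K)))
```

The assertion confirms that the two small inverses reproduce the full VM x VM inverse, which is what allows the cubic cost in VM to be avoided.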
Subjects
free text keywords: Economics and Econometrics, Time complexity, Regression, Gaussian, Likelihood function, Computer science, Matrix (mathematics), Algorithm, Basis function, Training set, Computer Science - Learning, Statistics - Machine Learning, HB
Related Organizations
29 references, page 1 of 2

Alvarez, M. and Lawrence, N.D., Sparse convolved Gaussian processes for multi-output regression. In Advances in Neural Information Processing Systems, pp. 57--64, 2009.

Anderson, T.W., An introduction to multivariate statistical analysis (2nd edn), 1984, Wiley.

Arnold, S.F., The theory of linear models and multivariate analysis, 1981, Wiley.

Ben-Shimon, D. and Shmilovici, A., Accelerating the relevance vector machine via data partitioning. Foundations of Computing and Decision Sciences, 2006, 31, 27--41.

Bonilla, E.V., Chai, K.M.A. and Williams, C.K.I., Multi-task Gaussian process prediction. In Advances in Neural Information Processing Systems, pp. 153--160, 2007.

Boyle, P. and Frean, M.R., Dependent Gaussian processes. In Advances in Neural Information Processing Systems, pp. 217--224, 2004.

Catanzaro, B., Sundaram, N. and Keutzer, K., Fast support vector machine training and classification on graphics processors. In Proceedings of the 25th International Conference on Machine Learning, pp. 104--111, 2008.

Chang, C.C. and Lin, C.J., LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2011, 2, 27:1--27:27.

Chu, W., Keerthi, S.S. and Ong, C.J., Bayesian support vector regression using a unified loss function. IEEE Transactions on Neural Networks, 2004, 15, 29--44.

Cortes, C. and Vapnik, V., Support-vector networks. Machine learning, 1995, 20, 273--297.

Gao, J.B., Gunn, S.R., Harris, C.J. and Brown, M., A probabilistic framework for SVM regression and error bar estimation. Machine Learning, 2002, 46, 71--89.

Gibbs, M., Bayesian Gaussian processes for classification and regression. PhD thesis, University of Cambridge, 1997.

Gramacy, R.B., Niemi, J. and Weiss, R.M., Massively parallel approximate Gaussian process regression. SIAM/ASA Journal on Uncertainty Quantification, 2014, 2, 564--584.

Guo, G. and Zhang, J.S., Reducing examples to accelerate support vector regression. Pattern Recognition Letters, 2007, 28, 2173--2183.

Montgomery, D.C., Applied Statistics and Probability for Engineers (6th edn), 2013, Wiley.
