[1] V. N. Vapnik, Statistical Learning Theory, New York, NY: Wiley, 1998.

[2] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, Cambridge: Cambridge University Press, 2000.

[3] S.-Y. Kung, Kernel Methods and Machine Learning, Cambridge: Cambridge University Press, 2014.

[4] S.-Y. Kung and P.-Y. Wu, “On efficient learning and classification kernel methods,” in Proc. 2012 IEEE Int. Conf. Acoustics, Speech and Signal Processing, Kyoto, Japan, Mar. 25-30, 2012, pp. 2065-2068.

[5] Y. Zhang, J. Duchi, and M. Wainwright, “Divide and conquer kernel ridge regression,” in Proc. 2013 Conf. on Learning Theory, Princeton, NJ, USA, Jun. 12-14, 2013, pp. 592-617.

[6] C.-J. Hsieh, S. Si, and I. S. Dhillon, “A divide-and-conquer solver for kernel support vector machines,” in Proc. 31st Int. Conf. Machine Learning, Beijing, China, 2014, pp. 566-574.

[7] C. B. Moler and G. W. Stewart, “An algorithm for generalized matrix eigenvalue problems,” SIAM Journal on Numerical Analysis, vol. 10, no. 2, pp. 241-256, 1973.

[8] R.-E. Fan, K.-W. Chang, C.-J. Hsieh, X.-R. Wang, and C.-J. Lin, “LIBLINEAR: A library for large linear classification,” J. Mach. Learn. Res., vol. 9, pp. 1871-1874, June 2008.

[9] T. Joachims, “Making large-scale SVM learning practical,” in Advances in Kernel Methods - Support Vector Learning, B. Schölkopf, C. Burges, and A. Smola, Eds. Cambridge, MA: MIT Press, 1999.

[10] W.-C. Kao, K.-M. Chung, C.-L. Sun, and C.-J. Lin, “Decomposition methods for linear support vector machines,” Neural Computation, vol. 16, no. 8, pp. 1689-1704, Aug. 2004.