This study presents a divide-and-conquer (DC) approach to classification based on feature space decomposition. For large-scale datasets, typical approaches employ either truncated kernel methods on the feature space or DC approaches on the sample space.
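To make the distinction concrete, the following is a minimal sketch (not the paper's algorithm) of DC by feature space decomposition: the feature dimensions are split into disjoint blocks, a simple least-squares linear scorer is fit independently on each block, and the per-block scores are averaged to produce the final decision. All function names and the toy data here are illustrative assumptions.

```python
import numpy as np

def dc_feature_split_fit(X, y, n_blocks=2):
    """Fit one least-squares linear scorer per disjoint feature block."""
    blocks = np.array_split(np.arange(X.shape[1]), n_blocks)
    models = []
    for idx in blocks:
        Xb = X[:, idx]
        # w solves min ||Xb w - y||^2 using this feature block only
        w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
        models.append((idx, w))
    return models

def dc_feature_split_predict(models, X):
    """Average the per-block scores, then threshold at zero."""
    score = np.mean([X[:, idx] @ w for idx, w in models], axis=0)
    return np.where(score >= 0, 1, -1)

# Toy binary problem: the label depends on features 0 and 3, which
# land in different blocks, so each sub-model sees part of the signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = np.where(X[:, 0] + X[:, 3] >= 0, 1, -1)
models = dc_feature_split_fit(X, y.astype(float), n_blocks=2)
acc = np.mean(dc_feature_split_predict(models, X) == y)
```

Contrast this with sample-space DC (as in divide-and-conquer kernel ridge regression or DC-SVM), which would instead partition the rows of `X`, train a full-feature model per partition, and combine the resulting models.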