The nearest subspace classifier (NSS) estimates the underlying subspace of each class and assigns a data point to the class whose subspace is nearest. This paper studies how well NSS generalizes to new samples. It is proved t...
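The classification rule described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes an affine variant of NSS in which each class subspace is fitted by SVD around the class mean, with a user-chosen subspace dimension `dim`; the class name and its methods are hypothetical.

```python
import numpy as np

class NearestSubspaceClassifier:
    """Sketch of an (assumed affine) NSS classifier: fit a low-dimensional
    subspace per class via SVD, then assign each point to the class whose
    subspace gives the smallest residual after orthogonal projection."""

    def __init__(self, dim=2):
        self.dim = dim   # assumed per-class subspace dimension
        self.bases = {}  # class label -> (class mean, orthonormal basis)

    def fit(self, X, y):
        for label in np.unique(y):
            Xc = X[y == label]
            mu = Xc.mean(axis=0)
            # Top right-singular vectors span the best-fit subspace
            _, _, Vt = np.linalg.svd(Xc - mu, full_matrices=False)
            self.bases[label] = (mu, Vt[: self.dim].T)
        return self

    def predict(self, X):
        labels = sorted(self.bases)
        dists = []
        for label in labels:
            mu, B = self.bases[label]
            R = X - mu
            # Residual component orthogonal to the class subspace
            resid = R - (R @ B) @ B.T
            dists.append(np.linalg.norm(resid, axis=1))
        return np.array(labels)[np.argmin(np.vstack(dists), axis=0)]
```

The distance to each class is the norm of the projection residual, so a point lying almost exactly on one class's subspace is assigned to that class even if it is far from the class mean along the subspace.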