
In this article we present a new class of support vector machines for the binary classification task. Our support vector machines are built on only two support vectors and have a very low Vapnik-Chervonenkis dimension, so they generalize well. Geometrically, the approach searches for a suitable pair of observations drawn from the two classes of the response variable; once this pair is found, the discriminant hyperplane is taken orthogonal to the line connecting the two observations. The method handles data sets with a large number of features and a small number of observations well, such as gene expression data. We illustrate the performance of our classification method on gene expression data and show that it outperforms other classifiers, in particular diagonal linear discriminant analysis and k-nearest neighbors, which achieved the lowest error rates in previous studies of tumor classification.
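The geometric construction described above can be sketched in a few lines. The sketch below assumes a midpoint offset (the hyperplane passes through the midpoint of the segment joining the chosen pair) and an exhaustive pair search minimizing training error; the abstract states only that the hyperplane is orthogonal to the connecting line, so both choices are illustrative assumptions, and the function names (`two_point_hyperplane`, `fit_pair`) are hypothetical.

```python
import numpy as np

def two_point_hyperplane(x_pos, x_neg):
    """Hyperplane orthogonal to the segment joining the two chosen
    observations. Passing it through the segment's midpoint is an
    assumption; the paper only states orthogonality."""
    w = x_pos - x_neg                    # normal vector along the connecting line
    b = -w @ (x_pos + x_neg) / 2.0      # offset so the midpoint lies on the plane
    return w, b

def predict(w, b, X):
    """Classify rows of X by which side of the hyperplane they fall on."""
    return np.sign(X @ w + b)

def fit_pair(X, y):
    """Illustrative pair search: try every (positive, negative) pair of
    training observations and keep the one whose induced hyperplane
    misclassifies the fewest training points (criterion is hypothetical)."""
    pos, neg = X[y == 1], X[y == -1]
    best = None
    for xp in pos:
        for xn in neg:
            w, b = two_point_hyperplane(xp, xn)
            errs = np.sum(predict(w, b, X) != y)
            if best is None or errs < best[0]:
                best = (errs, w, b)
    return best[1], best[2]
```

Because the classifier is parameterized by just two training points, its capacity stays small regardless of the number of features, which is the property the abstract appeals to for high-dimensional, small-sample data.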
Keywords: support vector machines, Vapnik-Chervonenkis dimension, microarray experiment, tumor classification.
