Preprint · Part of book or chapter of book · 2017

Scalable Nonlinear AUC Maximization Methods

Majdi Khalid, Indrakshi Ray, Hamidreza Chitsaz
Open Access English
  • Published: 02 Oct 2017
The area under the ROC curve (AUC) is a widely used measure for evaluating classification performance on heavily imbalanced data. Kernelized AUC maximization machines achieve better generalization than linear AUC machines because they can model the complex nonlinear structures underlying most real-world data. However, their high training complexity makes kernelized AUC machines infeasible for large-scale data. In this paper, we present two nonlinear AUC maximization algorithms that optimize linear classifiers over a finite-dimensional feature space constructed via the k-means Nystrom approximation. Our first a...
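The construction the abstract describes, mapping data into a finite-dimensional feature space via the Nystrom method with k-means cluster centers as landmarks, can be sketched as follows. This is a minimal illustration under our own assumptions, not the paper's implementation: the RBF kernel, the `gamma` value, and the tiny k-means routine are choices made for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """RBF kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kmeans_landmarks(X, m, iters=20, seed=0):
    """Tiny k-means: returns m cluster centers to use as Nystrom landmarks."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), m, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center, then recompute centers
        idx = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(m):
            if (idx == j).any():
                C[j] = X[idx == j].mean(0)
    return C

def nystrom_features(X, landmarks, gamma=1.0):
    """Finite-dimensional features phi(X) with phi(X) @ phi(X).T ~= K(X, X)."""
    K_nm = rbf_kernel(X, landmarks, gamma)          # kernel to the m landmarks
    K_mm = rbf_kernel(landmarks, landmarks, gamma)  # kernel among landmarks
    w, V = np.linalg.eigh(K_mm)                     # K_mm^{-1/2} via eigendecomposition
    w = np.maximum(w, 1e-12)                        # guard against tiny eigenvalues
    return K_nm @ (V * w ** -0.5) @ V.T
```

A linear AUC maximizer trained on `nystrom_features(X, kmeans_landmarks(X, m))` then behaves as a nonlinear classifier in the original input space, at a cost that scales with the number of landmarks m rather than the number of training pairs.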
Subjects (arXiv): Computer Science - Machine Learning; Computer Science - Information Retrieval; Statistics - Machine Learning
Subjects (ACM CCS): Computing Methodologies - Pattern Recognition
Free-text keywords: Computer Science - Machine Learning, Scalability, Hinge loss, Algorithm, Maximization, Convergence, Classifier, Regularization, Computer science, Feature vector, Pairwise comparison
Available in 2 versions: preprint (via UnpayWall) and book chapter, 2019 (via Crossref).