Preprint, 2009

Distribution-Specific Agnostic Boosting

Vitaly Feldman
Open Access · English · Published: 16 Sep 2009
Abstract
We consider the problem of boosting the accuracy of weak learning algorithms in the agnostic learning framework of Haussler (1992) and Kearns et al. (1992). Known algorithms for this problem (Ben-David et al., 2001; Gavinsky, 2002; Kalai et al., 2008) follow the same strategy as boosting algorithms in the PAC model: the weak learner is executed on the same target function but over different distributions on the domain. We demonstrate boosting algorithms for the agnostic learning framework that only modify the distribution on the labels of the points (or, equivalently, modify the target function). This allows boosting a distribution-specific weak agnostic learner...
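To make the contrast concrete, here is a toy sketch (not from the paper) of the standard PAC-style strategy the abstract describes: an AdaBoost-style booster that keeps the labels fixed and re-weights the distribution over domain points between rounds, feeding the re-weighted sample to a weak learner (a one-dimensional threshold stump here). The dataset and the weak learner are illustrative assumptions, not part of the paper.

```python
# Illustrative sketch: distribution-reweighting boosting (AdaBoost-style).
# The weak learner sees the SAME labels each round; only the weights
# (i.e., the distribution over domain points) change between rounds.
import math

def stump_predict(theta, sign, x):
    """Weak hypothesis: +sign if x >= theta, else -sign."""
    return sign if x >= theta else -sign

def train_stump(xs, ys, ws):
    """Weak learner: threshold/sign pair minimizing weighted error."""
    best = None
    for theta in sorted(set(xs)):
        for sign in (+1, -1):
            err = sum(w for x, y, w in zip(xs, ys, ws)
                      if stump_predict(theta, sign, x) != y)
            if best is None or err < best[0]:
                best = (err, theta, sign)
    return best  # (weighted error, theta, sign)

def adaboost(xs, ys, rounds=5):
    n = len(xs)
    ws = [1.0 / n] * n                 # start with the uniform distribution
    ensemble = []
    for _ in range(rounds):
        err, theta, sign = train_stump(xs, ys, ws)
        if err >= 0.5 or err <= 0:
            break                      # no usable weak advantage / perfect fit
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, theta, sign))
        # Re-weight the distribution over points; labels never change.
        ws = [w * math.exp(-alpha * y * stump_predict(theta, sign, x))
              for x, y, w in zip(xs, ys, ws)]
        z = sum(ws)
        ws = [w / z for w in ws]
    return ensemble

def predict(ensemble, x):
    s = sum(a * stump_predict(t, sg, x) for a, t, sg in ensemble)
    return 1 if s >= 0 else -1

# Toy data: an interval concept no single stump represents exactly,
# but a weighted vote of a few stumps does.
xs = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
ys = [-1, -1, 1, 1, 1, 1, -1, -1]
h = adaboost(xs, ys, rounds=5)
train_acc = sum(predict(h, x) == y for x, y in zip(xs, ys)) / len(xs)
```

The paper's point of departure is that the algorithms it proposes do not re-weight the domain points in this way; they modify only the distribution on the labels, so the weak learner can be specific to a single fixed distribution on the domain.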
Subjects
Keywords: Computer Science - Learning; Computer Science - Computational Complexity
33 references, page 1 of 3

[1] B. Barak, M. Hardt, and S. Kale. The uniform hardcore lemma via approximate Bregman projections. In Proceedings of SODA, pages 1193-1200, 2009.

[2] S. Ben-David, P. Long, and Y. Mansour. Agnostic boosting. In Proceedings of COLT/EuroCOLT, pages 507-516, 2001.

[3] A. Blum, M. Furst, J. Jackson, M. Kearns, Y. Mansour, and S. Rudich. Weakly learning DNF and characterizing statistical query learning using Fourier analysis. In Proceedings of STOC, pages 253-262, 1994.

[4] N. Bshouty, J. Jackson, and C. Tamon. More efficient PAC-learning of DNF with membership queries under the uniform distribution. Journal of Computer and System Sciences, 68(1):205-234, 2004.

[5] N. Bshouty, E. Mossel, R. O'Donnell, and R. Servedio. Learning DNF from random walks. In Proceedings of FOCS, pages 189-199, 2003.

[6] T. Dietterich. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 40(2):139-157, 2000.

[7] C. Domingo and O. Watanabe. MadaBoost: a modified version of AdaBoost. In Proceedings of COLT, pages 180-189, 2000.

[8] V. Feldman. A complete characterization of statistical query learning with applications to evolvability. Manuscript (to appear in Proceedings of FOCS), Apr. 2009.

[9] V. Feldman, P. Gopalan, S. Khot, and A. Ponnuswami. New results for learning noisy parities and halfspaces. In Proceedings of FOCS, pages 563-574, 2006.

[10] Y. Freund. An adaptive version of the boost-by-majority algorithm. In Proceedings of the Twelfth Annual Conference on Computational Learning Theory, pages 102-113, 1999.

[11] Y. Freund and R. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1):119-139, 1997.

[12] D. Gavinsky. Optimally-smooth adaptive boosting and application to agnostic learning. Journal of Machine Learning Research, 4:101-117, 2003.

[13] O. Goldreich and L. Levin. A hard-core predicate for all one-way functions. In Proceedings of STOC, pages 25-32, 1989.

[14] P. Gopalan, A. Kalai, and A. Klivans. Agnostically learning decision trees. In Proceedings of STOC, pages 527-536, 2008.

[15] A. Hajnal, W. Maass, P. Pudlak, M. Szegedy, and G. Turan. Threshold circuits of bounded depth. Journal of Computer and System Sciences, 46:129-154, 1993.
