We consider the problem of boosting the accuracy of weak learning algorithms in the agnostic learning framework of Haussler (1992) and Kearns et al. (1992). Known algorithms for this problem (Ben-David et al., 2001; Gavinsky, 2002; Kalai et al., 2008) follow the same st...
 B. Barak, M. Hardt, and S. Kale. The uniform hardcore lemma via approximate Bregman projections. In Proceedings of SODA, pages 1193-1200, 2009.
 S. Ben-David, P. Long, and Y. Mansour. Agnostic boosting. In Proceedings of COLT/EuroCOLT, pages 507-516, 2001.
 A. Blum, M. Furst, J. Jackson, M. Kearns, Y. Mansour, and S. Rudich. Weakly learning DNF and characterizing statistical query learning using Fourier analysis. In Proceedings of STOC, pages 253-262, 1994.
 N. Bshouty, J. Jackson, and C. Tamon. More efficient PAC-learning of DNF with membership queries under the uniform distribution. Journal of Computer and System Sciences, 68(1):205-234, 2004.
 N. Bshouty, E. Mossel, R. O'Donnell, and R. Servedio. Learning DNF from random walks. In Proceedings of FOCS, pages 189-199, 2003.
 T. Dietterich. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 40(2):139-157, 2000.
 C. Domingo and O. Watanabe. MadaBoost: a modified version of AdaBoost. In Proceedings of COLT, pages 180-189, 2000.
 V. Feldman. A complete characterization of statistical query learning with applications to evolvability. Manuscript (to appear in Proceedings of FOCS), Apr. 2009.
 V. Feldman, P. Gopalan, S. Khot, and A. Ponnuswami. New results for learning noisy parities and halfspaces. In Proceedings of FOCS, pages 563-574, 2006.
 Y. Freund. An adaptive version of the boost-by-majority algorithm. In Proceedings of COLT, pages 102-113, 1999.