Entropy gain is widely used for learning decision trees. However, as we go deeper down the tree, the examples become rarer and the faithfulness of entropy decreases. Thus, misleading choices and over-fitting may occur, and the tree has to be a...
Abellán, J. and Moral, S. Upper entropy of credal sets. Applications to credal classification. International Journal of Approximate Reasoning, 39:235-255, 2005.
Agresti, A. and Coull, B.A. Approximate Is Better than "Exact" for Interval Estimation of Binomial Proportions. The American Statistician, 52(2):119-126, May 1998.
Bernard, J.M. An introduction to the imprecise Dirichlet model for multinomial data. International Journal of Approximate Reasoning, 39(2-3):123-150, 2005. Imprecise Probabilities and Their Applications.
Breiman, L., Friedman, J.H., Olshen, R.A., and Stone, C.J. Classification and Regression Trees. Chapman & Hall, New York, NY, 1984.
Buntine, W. and Niblett, T. A further comparison of splitting rules for decision-tree induction. Machine Learning, 8(1):75-85, 1992.
Demšar, J. Statistical comparisons of classifiers over multiple data sets. J. Mach. Learn. Res., 7:1-30, December 2006.
Domingos, P. and Hulten, G. Mining high-speed data streams. In Proceedings of the Sixth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD '00, pp. 71-80, New York, NY, USA, 2000. ACM.
Dubois, D. Possibility theory and statistical reasoning. Computational Statistics and Data Analysis, 51:47-69, 2006.
Dubois, D. and Hüllermeier, E. Comparing probability measures using possibility theory: A notion of relative peakedness. International Journal of Approximate Reasoning, 45(2):364-385, 2007.
Dubois, D. and Prade, H. When upper probabilities are possibility measures. Fuzzy Sets and Systems, 49:65-74, 1992.