
Estimation of Entropy and Mutual Information

Paninski, Liam
Open Access
  • Published: 01 Jun 2003
  • Journal: Neural Computation, volume 15, pages 1191–1253 (ISSN: 0899-7667, eISSN: 1530-888X)
  • Publisher: MIT Press - Journals
Abstract
We present some new results on the nonparametric estimation of entropy and mutual information. First, we use an exact local expansion of the entropy function to prove almost sure consistency and central limit theorems for three of the most commonly used discretized information estimators. The setup is related to Grenander's method of sieves and places no assumptions on the underlying probability measure generating the data. Second, we prove a converse to these consistency theorems, demonstrating that a misapplication of the most common estimation techniques leads to an arbitrarily poor estimate of the true information, even given unlimited data. This "inconsistency" theorem ...
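
The three discretized estimators referred to in the abstract are the maximum-likelihood ("plug-in") estimator, the Miller–Madow bias-corrected estimator, and the jackknifed MLE. Below is a minimal NumPy sketch of the three for illustration (not code from the paper; the function names are ours), assuming the data have already been discretized into a vector of bin counts:

```python
import numpy as np

def plugin_entropy(counts):
    """Maximum-likelihood ("plug-in") entropy estimate, in nats,
    from a vector of bin counts."""
    n = counts.sum()
    p = counts[counts > 0] / n
    return float(-np.sum(p * np.log(p)))

def miller_madow_entropy(counts):
    """Miller-Madow correction: add (m_hat - 1) / (2N), where m_hat
    is the number of occupied bins and N is the sample size."""
    m_hat = np.count_nonzero(counts)
    return plugin_entropy(counts) + (m_hat - 1) / (2 * counts.sum())

def jackknife_entropy(counts):
    """Jackknifed MLE: N*H_MLE - ((N-1)/N) * sum over samples of the
    leave-one-out MLE. Samples in the same bin give identical
    leave-one-out estimates, so weight each bin's term by its count."""
    n = counts.sum()
    loo_sum = 0.0
    for j, c in enumerate(counts):
        if c == 0:
            continue
        held_out = counts.copy()
        held_out[j] -= 1  # remove one sample from bin j
        loo_sum += c * plugin_entropy(held_out)
    return n * plugin_entropy(counts) - (n - 1) / n * loo_sum
```

For example, with `rng = np.random.default_rng(0)` and `counts = np.bincount(rng.integers(0, 8, size=50), minlength=8)`, the plug-in estimate is typically biased downward relative to the true value log 8, illustrating the small-sample bias the paper analyzes.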
Subjects
Free-text keywords: Mathematical optimization, Applied mathematics, Information theory, Mutual information, Probability distribution, Entropy estimation, Binary entropy function, Mathematics, Estimator, Bias of an estimator, Consistent estimator, Calculus