
The Gaussian law reigns supreme in the information theory of analog random variables. This paper showcases a number of information-theoretic results that find elegant counterparts for Cauchy distributions. New concepts, such as that of equivalent pairs of probability measures and the strength of real-valued random variables, are introduced here and shown to be of particular relevance to Cauchy distributions.
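Among the elegant closed forms alluded to above is the relative entropy between two Cauchy distributions, which, unlike the Gaussian case, is symmetric in its arguments. A minimal sketch verifying the known closed form D(Cauchy(μ₀,γ₀)‖Cauchy(μ₁,γ₁)) = log(((γ₀+γ₁)² + (μ₀−μ₁)²)/(4γ₀γ₁)) against direct numerical integration (this assumes SciPy is available; the function names are illustrative, not from the paper):

```python
import math
from scipy.integrate import quad

def cauchy_pdf(x, mu, gamma):
    # Density of a Cauchy(mu, gamma) random variable.
    return gamma / (math.pi * ((x - mu) ** 2 + gamma ** 2))

def kl_cauchy_closed_form(mu0, g0, mu1, g1):
    # Known closed form for D(Cauchy(mu0,g0) || Cauchy(mu1,g1));
    # note it is symmetric under swapping the two distributions.
    return math.log(((g0 + g1) ** 2 + (mu0 - mu1) ** 2) / (4 * g0 * g1))

def kl_numeric(mu0, g0, mu1, g1):
    # Direct numerical evaluation of the relative-entropy integral.
    def integrand(x):
        f0 = cauchy_pdf(x, mu0, g0)
        f1 = cauchy_pdf(x, mu1, g1)
        return f0 * math.log(f0 / f1)
    val, _ = quad(integrand, -math.inf, math.inf)
    return val

closed = kl_cauchy_closed_form(0.0, 1.0, 2.0, 3.0)
numeric = kl_numeric(0.0, 1.0, 2.0, 3.0)
swapped = kl_cauchy_closed_form(2.0, 3.0, 0.0, 1.0)
```

For these parameters the closed form gives log(((1+3)² + 4)/12) = log(5/3), and the symmetry of the formula is immediate since swapping (μ₀,γ₀) and (μ₁,γ₁) leaves it unchanged.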
Rényi divergence, relative entropy, Kullback–Leibler divergence, data transmission, differential entropy, lossy data compression, f-divergence, Cauchy distribution, Fisher's information, mutual information, entropy power inequality, information measures
