
arXiv: 2310.18419
handle: 11568/1276331, 11382/566932, 1887/4171972
Shannon entropy is the shortest average codeword length that a lossless compressor can achieve when encoding i.i.d. symbols. There are, however, settings in which the objective is to minimize the \textit{exponential} average codeword length, i.e., when the cost of encoding/decoding scales exponentially with the length of the codewords. The optimum is attained by every strategy that maps a symbol $x_i$, generated with probability $p_i$, into a codeword of length $\ell^{(q)}_D(i)=-\log_D\frac{p_i^q}{\sum_{j=1}^N p_j^q}$. The resulting minimum exponential average codeword length equals the Rényi, rather than the Shannon, entropy of the source distribution. We generalize the established Arithmetic Coding (AC) compressor to this framework. We show analytically that our generalized algorithm attains an exponential average length arbitrarily close to the Rényi entropy when the symbols to be encoded are i.i.d. We then apply the algorithm to both simulated (i.i.d.-generated) and real (a piece of Wikipedia text) datasets. While, as expected, the application to i.i.d. data confirms our analytical results, we also find that on the real dataset (composed of highly correlated symbols) our algorithm still significantly reduces the exponential average codeword length with respect to the classical `Shannonian' one. Moreover, we provide a further justification for the use of the exponential average: we show that by minimizing the exponential average length it is possible to minimize the probability that codewords exceed a given threshold length. This relation relies on the connection between the exponential average and the cumulant generating function of the source distribution, which is in turn related to the probability of large deviations. We again test and confirm these results on both simulated and real datasets.
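To make the optimal-length formula concrete, here is a minimal numerical sketch (not the authors' implementation; the function names and the example distribution are illustrative). It computes the escort-distribution lengths $\ell^{(q)}_D(i)$, checks that their exponential average (Campbell's average with cost parameter $t$, where $q=1/(1+t)$ per Campbell's theorem) equals the Rényi entropy of order $q$, and evaluates the Chernoff-type tail bound that links the exponential average to the probability of long codewords.

```python
import numpy as np

D = 2  # code alphabet size (binary)

def escort_lengths(p, q):
    """Ideal codeword lengths l_i = -log_D( p_i^q / sum_j p_j^q )."""
    escort = p**q / np.sum(p**q)
    return -np.log(escort) / np.log(D)

def exponential_average(p, lengths, t):
    """Campbell's exponential average: (1/t) log_D sum_i p_i D^(t*l_i)."""
    return np.log(np.sum(p * D**(t * lengths))) / (t * np.log(D))

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha, in base-D units."""
    return np.log(np.sum(p**alpha)) / ((1.0 - alpha) * np.log(D))

# A small i.i.d. source with a skewed distribution (illustrative).
p = np.array([0.5, 0.25, 0.125, 0.125])
t = 1.0              # cost parameter: encoding cost grows like D^(t * length)
q = 1.0 / (1.0 + t)  # optimal escort exponent for this t

lengths = escort_lengths(p, q)
L_t = exponential_average(p, lengths, t)
print(L_t)                        # ~1.8735 bits
print(renyi_entropy(p, alpha=q))  # equal: Renyi entropy of order 1/(1+t)

# Large-deviation link: t*L(t) is the cumulant generating function of the
# codeword length under p (in base-D units), so Markov's inequality gives,
# for any threshold l0 > L_t,  P(length >= l0) <= D^(t * (L_t - l0)).
# Minimizing L_t therefore tightens this tail bound.
l0 = 3.0  # hypothetical threshold length
print(D**(t * (L_t - l0)))
```

With $t\to 0$ the exponential average reduces to the ordinary (Shannon) average length, so the classical setting is recovered as a limit of this sketch.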
22 pages, 9 figures
FOS: Computer and information sciences; FOS: Physical sciences; Information Theory (cs.IT); Data Structures and Algorithms (cs.DS); Data Analysis, Statistics and Probability (physics.data-an); ACM classes: E.4, H.1.1; MSC classes: 68P30, 94A29, 94A17; Keywords: Arithmetic; Arithmetic coding; Campbell theorem; Channel coding; Codes; Costs; Data compression; Entropy; Europe; Large deviations; Rényi entropy; Symbols
| Indicator | Description | Value |
| --- | --- | --- |
| selected citations | Citations derived from selected sources; an alternative to "influence", which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 0 |
| popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
