
arXiv: 1708.08103
Motivated by the fact that universal source coding on countably infinite alphabets is not feasible, this work introduces the notion of almost lossless source coding. Analogous to the weak variable-length source coding problem studied by Han (IEEE Trans. Inf. Theory, vol. 46, pp. 1217-1226, 2000), almost lossless source coding relaxes the block-wise lossless requirement to allow an average per-letter distortion that vanishes asymptotically as the block-length tends to infinity. In this setup, we show, on one hand, that the Shannon entropy characterizes the minimum achievable rate (as in the case of finite-alphabet sources) and, on the other hand, that almost lossless universal source coding becomes feasible for the family of finite-entropy stationary memoryless sources with infinite alphabets. Furthermore, we study a stronger notion of almost lossless universality that demands uniform convergence of the average per-letter distortion to zero, and we establish a necessary and sufficient condition for the so-called family of envelope distributions to achieve it. Remarkably, this is the same necessary and sufficient condition required for the existence of a strongly minimax (lossless) universal source code for the family of envelope distributions. Finally, we show that an almost lossless coding scheme offers a faster rate of convergence of the (minimax) redundancy than the well-known information radius developed for the lossless case, at the expense of tolerating a nonzero distortion that vanishes as the block-length grows. This shows that, even when lossless universality is feasible, an almost lossless scheme can offer different regimes for the rates of convergence of the worst-case redundancy versus the worst-case distortion.
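As a rough illustration of the criterion described in the abstract (the encoder/decoder symbols $\phi_n$, $\psi_n$ and the use of the Hamming per-letter distortion are assumptions of this sketch, not notation taken from the paper), the almost lossless requirement can be written as follows:

```latex
% Sketch of the almost lossless criterion, under assumed notation:
% a block-length-n code with encoder \phi_n, decoder \psi_n, and
% reconstruction \hat{X}^n = \psi_n(\phi_n(X^n)).
\[
  d_n \;=\; \mathbb{E}\!\left[\frac{1}{n}\sum_{i=1}^{n}
            \mathbf{1}\{X_i \neq \hat{X}_i\}\right]
  \;\xrightarrow[n\to\infty]{}\; 0,
\]
% while, per the abstract, the minimum achievable coding rate is still
% characterized by the Shannon entropy H(X), as in the finite-alphabet
% lossless setting.
```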
This article has been accepted for publication by IEEE. Digital Object Identifier 10.1109/TIT.2019.2941895. Link: https://ieeexplore.ieee.org/document/8840879. The material in this paper was presented in part at the 2016 and 2017 IEEE International Symposia on Information Theory (ISIT) [1], [2].
FOS: Computer and information sciences, [SPI] Engineering Sciences [physics], Computer Science - Information Theory, Information Theory (cs.IT)
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources (an alternative to the "influence" indicator) | 5 |
| Popularity | "Current" impact/attention of the article in the research community, based on the underlying citation network | Average |
| Influence | Overall/total impact of the article in the research community (diachronically), based on the underlying citation network | Average |
| Impulse | Initial momentum of the article directly after its publication, based on the underlying citation network | Average |
