
Neuhoff and Gilbert (1982) defined a causal lossy source code as a system where the reconstruction of the present source sample is restricted to be a function of the present and past source samples, while the code stream itself may be non-causal and have variable rate. They showed that for stationary, memoryless sources, optimum causal source coding is achieved by time-sharing at most two entropy-coded scalar quantizers. We extend this result to general real-valued stationary sources with finite differential entropy rate, in the limit of small distortions. We show that for the mean-square distortion, the optimum causal encoder at high resolution is a fixed uniform quantizer followed by a sequence entropy coder. Thus, the cost of causality is the "space-filling loss" of the uniform quantizer, i.e., (1/2)log(2πe/12) ≈ 0.254 bit. This generalizes the well-known result of Gish and Pierce on asymptotically optimal entropy-constrained scalar quantization.
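As a quick numerical sanity check (not part of the paper itself), the space-filling loss constant quoted above can be evaluated directly; the value below is in bits, hence the base-2 logarithm:

```python
import math

# Space-filling loss of the scalar uniform quantizer, in bits:
#   (1/2) * log2(2*pi*e / 12)
space_filling_loss = 0.5 * math.log2(2 * math.pi * math.e / 12)

# Roughly 0.254-0.255 bit per sample, matching the figure cited in the abstract.
print(space_filling_loss)
```

Using the natural logarithm instead would give the same loss expressed in nats.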
