
Linear prediction is a mathematical operation that estimates future values of a discrete-time signal as a linear function of previous samples. When it is applied to predictive coding of waveforms such as speech and audio, a common issue that degrades compression performance is the non-stationary behaviour of the prediction residuals around the start of random access frames. This occurs because the dependencies between the prediction residuals and the historical waveform are interrupted to satisfy the random access requirement. As a result, the dynamic range of the prediction residuals fluctuates dramatically within such frames, substantially degrading the performance of the subsequent entropy coder. In this study, the authors address this long-standing issue by establishing a theoretical relationship between the energy envelope of the linear prediction residuals in random access frames and the prediction coefficients. Using this relationship, an adaptive normalisation method is formulated as a preprocessor to the entropy coder to mitigate the coding loss in random access frames. Simulation results confirm the superiority of the proposed method over existing solutions in terms of coding efficiency.
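To make the mechanism concrete, the sketch below (not taken from the paper) computes a frame's linear-prediction residual with zero filter memory, as happens at a random access point where no history is available, and then flattens the residual's dynamic range by dividing it by a per-sample energy-envelope estimate before entropy coding. The function names `levinson_durbin`, `lp_residual`, and `normalise_residual`, the prediction order of 16, and the causal cumulative-RMS envelope are illustrative assumptions; the paper instead derives the envelope analytically from the prediction coefficients rather than estimating it from the residual itself.

```python
import numpy as np
from scipy.signal import lfilter

def levinson_durbin(r, order):
    """Solve for LP coefficients a[0..order] (with a[0] = 1) from autocorrelation r."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    e = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / e                      # reflection coefficient
        a_prev = a.copy()
        for j in range(1, i):
            a[j] = a_prev[j] + k * a_prev[i - j]
        a[i] = k
        e *= 1.0 - k * k                  # updated prediction-error energy
    return a

def lp_residual(frame, order=16):
    """Inverse-filter a frame with its own LP analysis filter A(z).
    The filter memory is zero, mimicking a random access frame with no usable history,
    so the residual energy is inflated near the frame start."""
    n = len(frame)
    r = np.correlate(frame, frame, mode="full")[n - 1:n + order]   # lags 0..order
    a = levinson_durbin(r, order)
    return lfilter(a, [1.0], frame)

def normalise_residual(residual, eps=1e-8):
    """Divide the residual by a per-sample energy-envelope estimate so that its
    dynamic range is roughly flat before entropy coding. A crude causal
    cumulative RMS stands in for the paper's analytic envelope."""
    cum = np.cumsum(residual ** 2)
    envelope = np.sqrt(cum / np.arange(1, len(residual) + 1)) + eps
    return residual / envelope, envelope

# Usage example: one 1024-sample random access frame of a 1 kHz tone at 16 kHz.
fs = 16000
frame = np.sin(2 * np.pi * 1000 * np.arange(1024) / fs)
res = lp_residual(frame, order=16)
norm_res, env = normalise_residual(res)
```

In an actual codec the normalisation would have to be driven by quantities available to the decoder (such as the transmitted prediction coefficients), which is precisely why the paper's analytic envelope-coefficient relationship matters; the empirical envelope above is only a stand-in to illustrate the preprocessing step.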
