Although the physical-layer performance of 3G (UMTS) systems is still described by both the bit-error rate and the block-error rate, 4G (LTE) uses exclusively the latter, which at higher protocol layers is commonly referred to as the frame-error rate or packet-error rate of the downlink and uplink data streams. However, the block-error-rate estimation scheme depends heavily on the success rate of negative-acknowledgement transport via the return channel, and so does not provide trustworthy low values (e.g. around $10^{-5}$). The remaining errors that are not corrected by the physical layer determine the so-called residual channel and must be corrected by higher-layer protocols to provide an almost error-free application environment. Error-rate estimation is therefore important not only in research, development and manufacturing, but also in the network operator's environment. However, the relationship between bit-error- and block-error-oriented performance metrics is not always a simple one, either at the physical layer alone or across higher protocol layers. In this paper, we benchmark both types of performance metrics for consistency in practice, specifically on the residual channel of an operational microwave link, from the bottom up to application-layer, end-to-end perceptual speech quality measurements.
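As a baseline illustration of why the two metrics are related but not interchangeable (this derivation is an assumption for context, not a result from the paper): only under an idealized model of independent, identically distributed bit errors does a simple closed-form mapping between the bit-error rate (BER) and the block-error rate (BLER) hold for a block of $N$ bits:

$$\mathrm{BLER} = 1 - (1 - \mathrm{BER})^{N} \approx N \cdot \mathrm{BER} \quad \text{for } N \cdot \mathrm{BER} \ll 1.$$

On real channels, burst errors and the error-correcting code's behavior make bit errors strongly correlated within a block, so the measured BLER can deviate substantially from this i.i.d. prediction, which is one reason the two metrics must be benchmarked against each other in practice.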