Abstract

Coding techniques employed to resolve packet losses on wireless channels, such as Random Linear Network Coding (RLNC), promise the advantage of requiring less signalling and fewer retransmissions of missing packets to compensate for losses on unreliable links. However, as packet sizes of IP-based protocols can be distributed irregularly over the available maximum frame size and can vary considerably from packet to packet, current implementations of RLNC suffer from shifting header and/or payload lengths and lack a suitable means of compensation. The simplest solution, adopted by most RLNC approaches, is to pad the unequal packets with zeros up to the maximum packet size, which creates an unnecessary transmission overhead of 100 % or more. This paper presents, for the first time, a practical implementation of the new progressive-shortening-based RLNC. This scheme uses fixed-size regions inside the packets to avoid the zero-padding overhead and generates unequal-sized coded packets. Furthermore, it introduces a recoding feature for this scheme, another advantage of RLNC over other coding techniques, and breaks with the point-to-point topology considered in previous works. Moreover, we combine this novel macro-symbol-based coding scheme with Robust Header Compression version 2 (RoHCv2) to show the gain over the traditional implementation of RLNC in a real-life application where varying packet lengths dominate the transmission. Our implementation results, obtained with the KODO network coding library, show that the encoding throughput matches that of established RLNC methods, while the payload delivery efficiency can be enhanced by up to 20 %.
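
To illustrate the zero-padding overhead the abstract refers to, the following minimal sketch encodes one generation of variable-length packets with a conventional generation-based RLNC encoder over GF(2) (a random linear combination reduces to an XOR of a random subset of the padded symbols). It is not the paper's progressive-shortening scheme and does not use the KODO API; the packet sizes and helper names are hypothetical, chosen only to show how padding every symbol to the maximum length inflates the transmitted bytes.

    # Hypothetical illustration of zero-padding overhead in generation-based
    # RLNC over GF(2); not the paper's scheme and not the KODO API.
    import os
    import random

    def rlnc_encode_padded(packets):
        """Zero-pad all packets of a generation to the largest length, then
        emit one coded packet as a random linear combination over GF(2),
        i.e. the XOR of a randomly chosen subset of the padded symbols."""
        symbol_size = max(len(p) for p in packets)
        padded = [p + bytes(symbol_size - len(p)) for p in packets]
        coefficients = [random.randint(0, 1) for _ in padded]  # coding vector
        coded = bytearray(symbol_size)
        for coeff, symbol in zip(coefficients, padded):
            if coeff:
                coded = bytearray(a ^ b for a, b in zip(coded, symbol))
        return coefficients, bytes(coded)

    # One generation with strongly varying IP packet sizes (hypothetical values).
    packets = [os.urandom(n) for n in (60, 1400, 300, 90)]
    coeffs, coded = rlnc_encode_padded(packets)

    useful = sum(len(p) for p in packets)
    # At least one max-sized coded packet must be sent per source packet.
    sent = len(packets) * max(len(p) for p in packets)
    print(f"zero-padding overhead: {100 * (sent - useful) / useful:.0f} %")

With these example sizes the transmitted bytes exceed the useful payload by well over 100 %, which is the overhead the progressive-shortening scheme is designed to avoid by producing unequal-sized coded packets.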
