Abstract

Let $\mathbb{F}[D]$ be the polynomial ring over a finite field $\mathbb{F}$. Convolutional codes are submodules of $\mathbb{F}[D]^n$ that can be described by left prime polynomial matrices. In the last decade there has been great interest in convolutional codes equipped with a rank metric, called the sum rank metric, due to their wide range of applications in reliable linear network coding. However, this metric is suitable only for delay-free networks. In this work we continue this thread of research and introduce a new metric that overcomes this restriction and is therefore suitable for more general networks. We study this metric and characterize the distance properties in terms of the polynomial matrix representations of the convolutional code. We investigate convolutional codes that are optimal with respect to this new metric and present concrete constructions. These codes are the analogs, in the context of network coding, of Maximum Distance Profile convolutional codes. Moreover, we show that they can be built upon a class of superregular matrices, with entries in an extension field, that preserve their superregularity even after multiplication by certain matrices with entries in the ground field.

Highlights

  • Within the area of coding theory, network coding has been a very active topic of research, as it provides an effective tool to disseminate information over networks

  • We can consider the transmitted packets as the columns of a matrix with entries in a finite field $\mathbb{F}_q$; the linear combinations performed at the nodes of the network then correspond to column operations on this matrix

  • In order to state our results more precisely, we introduce the necessary material and notation from the standard theory of network coding and convolutional codes
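
The column-operation view in the highlights above can be sketched in a few lines. This is a minimal illustration of one "shot" of linear network coding, under the simplifying assumption $q = 2$ (the paper works over a general finite field $\mathbb{F}_q$); the variable names `X`, `T`, and the helper `gf2_inv` are ours, not the paper's notation.

```python
import numpy as np

def gf2_inv(T):
    """Invert a square 0/1 matrix over GF(2) by Gaussian elimination."""
    n = T.shape[0]
    A = np.concatenate([T % 2, np.eye(n, dtype=int)], axis=1)
    for col in range(n):
        pivot = next(r for r in range(col, n) if A[r, col] == 1)
        A[[col, pivot]] = A[[pivot, col]]          # swap pivot row up
        for r in range(n):
            if r != col and A[r, col] == 1:
                A[r] = (A[r] + A[col]) % 2         # clear the column
    return A[:, n:]

# Three packets of length 4, stored as the columns of X.
X = np.array([[1, 0, 1],
              [0, 1, 1],
              [1, 1, 0],
              [0, 0, 1]])

# Nodes forward GF(2)-linear combinations of the packets they receive:
# right-multiplying X by a transfer matrix T performs column operations.
T = np.array([[1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]])

Y = (X @ T) % 2          # packets observed at the sink

# If the overall transfer matrix T is invertible over GF(2), the sink
# recovers the original packets from the received ones.
X_rec = (Y @ gf2_inv(T)) % 2
assert np.array_equal(X_rec, X)
```

In the multi-shot setting studied in the paper, one such matrix of packets is sent at each time instant, and the channel may mix packets across time, which is why a metric on polynomial vectors is needed.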


Summary

Introduction

Within the area of coding theory, network coding has been a very active topic of research, as it provides an effective tool to disseminate information (packets) over networks. We show that the previously known metrics are not sufficient to characterize the error-correction capability of the code in these networks (see Example 1) and consider a new (rank) metric, called the column rank distance, that solves this problem. This distance is the rank analog of the so-called column distance of convolutional codes in the Hamming metric; see [10, 14, 15]. We show that the column rank distance characterizes the error-correcting capability of the convolutional code within a time interval, in more general network channels (see Theorem 4). The problem of deriving superregular matrices to build convolutional codes has become an active area of research (see for instance [3, 6, 12]), and the results presented here extend the known results on this topic.
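
For context, the quantities discussed above can be written out as follows. This is the standard definition of the sum rank weight and, by analogy with the Hamming column distance, a windowed version of it; the notation ($\operatorname{rank}(v_t)$ taken over the ground field $\mathbb{F}_q$ after expanding each coefficient of $v(D) \in \mathbb{F}_{q^m}[D]^n$) is a common convention and is not quoted from this paper.

```latex
% Sum rank weight: expand each coefficient v_t over F_q and sum the ranks.
\[
  \operatorname{swt}\bigl(v(D)\bigr) = \sum_{t\ge 0} \operatorname{rank}(v_t),
  \qquad v(D) = \sum_{t\ge 0} v_t D^t \in \mathbb{F}_{q^m}[D]^n .
\]
% The j-th column rank distance of a code C restricts the sum to the
% window 0,...,j, over codewords with nonzero constant coefficient:
\[
  d^{c}_{j}(\mathcal{C})
    = \min\Bigl\{ \sum_{t=0}^{j} \operatorname{rank}(v_t)
        \;:\; v(D) \in \mathcal{C},\ v_0 \neq 0 \Bigr\}.
\]
```

Restricting the weight to a sliding window is what makes the metric sensitive to decoding within a time interval, which is the property Theorem 4 exploits.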

Preliminaries
The network model
Multi-shot convolutional codes
Metrics for multi-shot network coding
MRD convolutional codes: A matrix characterization
Superregular matrices to build MRP codes
Block Toeplitz Superregular matrices
Conclusions
