Abstract

Soft-input soft-output decoding consists of estimating the information-symbol a posteriori probabilities, taking into account the coding constraints, given the received-symbol a priori probabilities. Kullback's principle of cross-entropy minimization is used to derive optimum decoding rules, assuming a linear, possibly non-systematic, binary code. A log-likelihood formalism is used throughout. Two decoding rules result: one obtained by solving a system of implicit non-linear analog equations written in terms of a generator matrix, the other being the conventional one. We discuss simplified versions of both rules, present an interpretation of the first, and compare it with the second, given that both are intended to approximate the same quantities.
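As a toy illustration of the soft-input soft-output setting described above (not the paper's generator-matrix rule), the sketch below computes a posteriori bit log-likelihood ratios for a single even-parity constraint using the standard tanh rule; the code, the function names, and the example LLR values are assumptions for illustration only.

```python
import math

def extrinsic_llr(llrs, i):
    """Extrinsic LLR for bit i under one even-parity constraint,
    via the tanh rule: tanh(L_e/2) = prod_{j != i} tanh(L_j/2)."""
    prod = 1.0
    for j, l in enumerate(llrs):
        if j != i:
            prod *= math.tanh(l / 2.0)
    # Clamp to keep atanh finite when all other bits are near-certain.
    prod = max(min(prod, 1.0 - 1e-12), -1.0 + 1e-12)
    return 2.0 * math.atanh(prod)

def siso_decode(llrs):
    """A posteriori LLR = a priori LLR + extrinsic LLR, per bit."""
    return [l + extrinsic_llr(llrs, i) for i, l in enumerate(llrs)]

# Hypothetical a priori LLRs for three bits constrained to even parity.
apriori = [2.0, 1.5, -0.5]
aposteriori = siso_decode(apriori)
```

Here the coding constraint pulls the weakly decided third bit toward agreement with the two confident bits, which is precisely the soft information a concatenated or iterative decoder would exploit.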
