Abstract

Consider a channel whose input alphabet $\mathbb {X}=\{x_{1},x_{2}, {\dots },x_{K}\}$ contains $K$ discrete symbols, modeled as a discrete random variable $X$ with probability mass function $\mathbf {p}(\mathbf {x}) = [p(x_{1}), p(x_{2}), {\dots }, p(x_{K})]$ , and whose received signal $Y$ is a continuous random variable. $Y$ is a distorted version of $X$ , with the channel distortion characterized by the conditional densities $p(y|x_{i})=\phi _{i}(y)$ , $i=1,2, {\dots },K$ . To recover $X$ , a quantizer $Q$ maps $Y$ back to a discrete output alphabet $\mathbb {Z} =\{z_{1}, z_{2}, {\dots }, z_{N}\}$ corresponding to a random variable $Z$ with probability mass function $\mathbf {p}(\mathbf {z}) = [p(z_{1}), p(z_{2}), {\dots }, p(z_{N})]$ , such that the mutual information $I(X;Z)$ is maximized subject to an arbitrary constraint on $\mathbf {p}(\mathbf {z})$ . Formally, we are interested in designing an optimal quantizer $Q^{*}$ that maximizes $\beta I(X;Z) - C(Z)$ , where $\beta $ is a positive number that controls the trade-off between maximizing $I(X;Z)$ and minimizing an arbitrary cost function $C(Z)$ . Let $\mathbf {p}(\mathbf {x}|y)=[p(x_{1}|y),p(x_{2}|y), {\dots },p(x_{K}|y)]$ denote the posterior distribution of $X$ for a given value of $y$ . We show that for any arbitrary cost function $C(\cdot)$ , the optimal quantizer $Q^{*}$ separates the vectors $\mathbf {p}(\mathbf {x}|y)$ into convex regions. Using this result, we propose a method to determine an upper bound on the number of thresholds (decision variables on $y$ ), which is used to speed up the algorithm for finding an optimal quantizer. Numerical results are presented to validate the findings.
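The setup above can be illustrated with a small numerical sketch. This is an illustrative toy, not the paper's algorithm: it assumes binary antipodal inputs $\{-1,+1\}$ through additive Gaussian noise, a 3-level quantizer defined by two thresholds on $y$, the output entropy $H(Z)$ standing in as one concrete choice of the arbitrary cost $C(Z)$, and a brute-force grid search over the threshold pair in place of the accelerated algorithm. All parameter values (`sigma`, `beta`, the grid) are arbitrary choices for the demo.

```python
import math
from itertools import product

def gauss_cdf(y, mu, sigma):
    """Gaussian CDF at y for mean mu and std sigma, via the error function."""
    return 0.5 * (1.0 + math.erf((y - mu) / (sigma * math.sqrt(2.0))))

def channel_matrix(thresholds, symbols, sigma):
    """p(z_j | x_i): probability that Y = x_i + noise lands in region j,
    where the regions are the intervals cut out by the thresholds."""
    edges = [-math.inf] + sorted(thresholds) + [math.inf]
    return [[gauss_cdf(hi, x, sigma) - gauss_cdf(lo, x, sigma)
             for lo, hi in zip(edges, edges[1:])] for x in symbols]

def objective(P, px, beta):
    """beta * I(X;Z) - C(Z), with C(Z) = H(Z) as an illustrative cost."""
    K, N = len(px), len(P[0])
    pz = [sum(px[i] * P[i][j] for i in range(K)) for j in range(N)]
    mi = sum(px[i] * P[i][j] * math.log2(P[i][j] / pz[j])
             for i in range(K) for j in range(N)
             if P[i][j] > 0 and pz[j] > 0)
    hz = -sum(p * math.log2(p) for p in pz if p > 0)
    return beta * mi - hz

# BPSK-like symbols through Gaussian noise, quantized to N = 3 regions.
symbols, px, sigma, beta = [-1.0, 1.0], [0.5, 0.5], 0.8, 4.0
grid = [i / 10 for i in range(-20, 21)]
best = max(((t1, t2) for t1, t2 in product(grid, grid) if t1 < t2),
           key=lambda t: objective(channel_matrix(t, symbols, sigma), px, beta))
```

The search over a threshold pair is justified by the paper's structural result: since the optimal quantizer partitions the posteriors $\mathbf {p}(\mathbf {x}|y)$ into convex regions, for this scalar channel it reduces to interval (threshold) decisions on $y$.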
