Abstract

This paper studies the communication of a multivariate Gaussian source over orthogonal additive white Gaussian noise channels using delay-free joint source-channel codes (JSCC). Two scenarios are considered: (1) all components of the multivariate Gaussian are transmitted by one encoder as a vector, or by several ideally collaborating nodes in a network; (2) the components of the multivariate Gaussian are transmitted by distributed nodes in a sensor network. In both scenarios, the goal is to recover all components of the multivariate Gaussian at the receiver. The paper investigates a subset of JSCC consisting of direct source-to-channel mappings that operate on a symbol-by-symbol basis to ensure zero coding delay. A theoretical analysis that helps explain and quantify the distortion behavior of such JSCC is given, and relevant performance bounds for the network are derived without constraints on complexity and delay. Optimal linear schemes for both scenarios are presented. Results for Scenario 1 show that linear mappings perform well, except when the correlation is high. In Scenario 2, linear mappings provide no gain from correlation as the channel signal-to-noise ratio (SNR) becomes large. For both scenarios, the gap to the performance upper bound is large, regardless of SNR, when the correlation is high. The main contribution of this paper is the investigation of nonlinear mappings for both scenarios. It is shown that nonlinear mappings can provide a substantial gain over the optimal linear schemes when the correlation is high. Contrary to linear mappings for Scenario 2, carefully chosen nonlinear mappings provide a gain at all SNRs, as long as the correlation is close to one. Both linear and nonlinear mappings are robust against variations in SNR.
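
As a rough sketch of one natural distributed linear baseline for Scenario 2 (not necessarily the paper's optimized scheme), the Python snippet below lets each node scale its own sample to meet a per-channel power constraint with unit noise variance, while the receiver forms a joint linear MMSE estimate that exploits the correlation. The function name distributed_linear_mse and the bivariate, unit-variance setup are assumptions made for this illustration.

    # Illustration only: distortion of a simple distributed linear mapping for a
    # bivariate Gaussian sent over two orthogonal AWGN channels with unit noise power.
    import numpy as np

    def distributed_linear_mse(rho, snr_db, sigma2=1.0):
        """Average per-component MSE of uncoded scaling plus joint LMMSE decoding."""
        snr = 10.0 ** (snr_db / 10.0)                     # per-channel P/N with noise power N = 1
        Cx = sigma2 * np.array([[1.0, rho], [rho, 1.0]])  # source covariance
        a = np.sqrt(snr / sigma2)                         # scaling that meets the power constraint
        Cy = a**2 * Cx + np.eye(2)                        # covariance of the channel outputs
        Cxy = a * Cx                                      # source/output cross-covariance
        Ce = Cx - Cxy @ np.linalg.solve(Cy, Cxy.T)        # LMMSE error covariance
        return np.trace(Ce) / 2.0

    for rho in (0.0, 0.9, 0.99):
        print(rho, [round(distributed_linear_mse(rho, s), 5) for s in (0, 10, 20, 30)])

For any fixed correlation below one, the per-component MSE in this baseline behaves like sigma^2/SNR as the SNR grows, so the decoder's gain from exploiting the correlation vanishes at high SNR, consistent with the Scenario 2 observation above.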

Highlights

  • We study the problem of transmitting a multivariate Gaussian source over orthogonal additive white Gaussian noise channels with joint source-channel codes (JSCC), where the source and channel dimensions, M, are equal

  • We require the JSCC to operate on a symbol-by-symbol basis

  • (1) All components of the multivariate Gaussian are transmitted by one encoder as a vector, or by several ideally collaborating nodes in a network

  • (2) The multivariate Gaussian is communicated by M distributed sensor nodes with correlated measurements in a sensor network

Summary

Introduction

We study the problem of transmitting a multivariate Gaussian source over orthogonal additive white Gaussian noise channels with joint source-channel codes (JSCC), where the source and channel dimensions, M, are equal. It was proven in [1] that distributed lossless coding of finite-alphabet correlated sources can be as rate efficient as coding with full collaboration between the sensor nodes; this result assumes no restriction on complexity and delay. For Scenario 1, the bounds can be found by equating the rate-distortion function for vector sources with the Gaussian channel capacity. These bounds can be achieved by separate source and channel coding (SSCC), assuming infinite complexity and delay. For Scenario 2, the bound is determined, in the case of two sensor nodes, by combining the rate-distortion region in [4,5] with the Gaussian channel capacity. This bound is achieved through SSCC by vector quantizing each source, applying Slepian-Wolf coding [1], and then using capacity-achieving channel codes [7].
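
As a sketch of how the Scenario 1 bound can be evaluated numerically (helper names such as opta_distortion are assumed for illustration and not taken from the paper), the snippet below equates the rate-distortion function of a Gaussian vector, computed by reverse water-filling over the eigenvalues of the source covariance, with the total capacity of M orthogonal AWGN channels.

    # Illustrative sketch: Scenario 1 distortion bound obtained by setting the
    # Gaussian vector rate-distortion function equal to M times the AWGN capacity.
    import numpy as np

    def opta_distortion(cov, snr_db):
        """Minimum total MSE for a Gaussian vector sent over len(cov) orthogonal AWGN channels."""
        snr = 10.0 ** (snr_db / 10.0)
        M = cov.shape[0]
        rate = M * 0.5 * np.log2(1.0 + snr)      # total channel capacity per source vector (bits)
        lam = np.linalg.eigvalsh(cov)            # eigenvalues of the source covariance

        def rate_at(theta):                      # rate spent by reverse water-filling at level theta
            return np.sum(0.5 * np.log2(np.maximum(lam / theta, 1.0)))

        lo, hi = 1e-12, float(lam.max())         # bisect on the water level theta
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if rate_at(mid) > rate else (lo, mid)
        theta = 0.5 * (lo + hi)
        return float(np.sum(np.minimum(lam, theta)))

    cov = np.array([[1.0, 0.95], [0.95, 1.0]])   # highly correlated bivariate source
    print(opta_distortion(cov, snr_db=10.0))

With unit variances and zero correlation this reduces to the scalar result D = sigma^2/(1 + SNR) per component; as the correlation magnitude grows, the same channel capacity supports a smaller total distortion.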

Problem Formulation and Performance Bounds
Distortion Bounds
Distributed Linear Mapping
Cooperative Linear Mapping
Nonlinear Mappings
Power and Distortion Formulation
Reconstruction of Common Information
Reconstruction of Common Information and Individual Contributions
Power and Distortion Calculation for Collaborating Encoders
Power and Distortion Calculation for Distributed Encoders
Extensions
Normal Vector for Archimedes Spiral
Metric Tensor