Abstract

A theory is developed for jointly minimizing the bit error rate (BER) between the desired and decoded signals with respect to the coefficients of the transmitter and receiver finite impulse response (FIR) multiple-input multiple-output (MIMO) filters. The original signal is assumed to be a vector time series with equally likely, memoryless Bernoulli vector components. The channel model consists of a known FIR MIMO transfer function and additive Gaussian noise independent of the original signal, and the channel input signal is assumed to be power constrained. Based on the formulas obtained, an iterative numerical optimization algorithm is proposed. Compared with other design methods available in the literature, the proposed method yields better results owing to the generality of the model considered and the joint optimization of the transmitter-receiver pair.
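
The abstract does not reproduce the BER formulas or the iterative algorithm itself. The sketch below only illustrates the general setting: jointly designing a power-constrained transmitter FIR filter and a receiver FIR filter for a known FIR channel with additive Gaussian noise, by alternating optimization. It simplifies to a scalar (SISO) channel with BPSK symbols and uses a least-squares (MSE) surrogate in place of the paper's BER objective; all names, filter lengths, and parameter values (h, f, g, Lf, Lg, P, sigma) are illustrative assumptions, not the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative scalar setup; the paper treats the full MIMO case.
h = np.array([1.0, 0.5, 0.2])        # known FIR channel impulse response (assumed)
Lf, Lg = 5, 9                        # transmitter / receiver FIR filter lengths
P = 1.0                              # channel-input power constraint
sigma = 0.1                          # std of the additive Gaussian noise
delay = (Lf + len(h) + Lg - 2) // 2  # decision delay through the filter cascade

N = 20000
s = rng.choice([-1.0, 1.0], size=N)  # equally likely memoryless Bernoulli (BPSK) symbols
d = np.zeros(N)
d[delay:] = s[:N - delay]            # delayed desired signal

f = np.zeros(Lf); f[0] = 1.0         # initial transmitter filter
g = np.zeros(Lg); g[0] = 1.0         # initial receiver filter

def conv_matrix(x, L):
    """Rows are length-L causal sliding windows of x, so M @ w == causal conv(x, w)."""
    xp = np.concatenate([np.zeros(L - 1), x])
    return np.lib.stride_tricks.sliding_window_view(xp, L)[:, ::-1]

for it in range(20):
    # Receiver step: least-squares (MMSE-like) g for the current transmitter f.
    x = np.convolve(s, f)[:N]                        # channel input
    x *= np.sqrt(P / np.mean(x**2))                  # project onto the power constraint
    r = np.convolve(x, h)[:N] + sigma * rng.standard_normal(N)
    g = np.linalg.lstsq(conv_matrix(r, Lg), d, rcond=None)[0]

    # Transmitter step: least-squares f for the fixed cascade conv(h, g).
    # The filtered-noise term is independent of f, so it is dropped here.
    q = np.convolve(h, g)
    S = conv_matrix(np.convolve(s, q)[:N], Lf)
    f = np.linalg.lstsq(S, d, rcond=None)[0]

# Monte-Carlo BER of the resulting transmitter-receiver pair.
x = np.convolve(s, f)[:N]
x *= np.sqrt(P / np.mean(x**2))
r = np.convolve(x, h)[:N] + sigma * rng.standard_normal(N)
y = np.convolve(r, g)[:N]
ber = np.mean(np.sign(y[delay:]) != d[delay:])
print(f"estimated BER after alternating optimization: {ber:.4f}")
```

The alternating structure mirrors the joint-design idea in spirit: each half-step is a convex least-squares problem, with the power constraint enforced by rescaling the channel input. The paper's actual algorithm optimizes BER directly from the derived formulas, which this MSE-based sketch does not attempt.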
