Abstract

This paper compares the complexity of the sphere decoder (SD) with that of a previously proposed detection scheme, denoted here as the block SD (BSD), when both are applied to the detection of multiple-input multiple-output (MIMO) systems in frequency-selective channels. The complexity of both algorithms depends on their preprocessing and tree search stages. Although the BSD was proposed as a means of greatly reducing the complexity of the preprocessing stage of the SD, no study had examined how that reduced preprocessing might affect the complexity of the tree search stage. This paper shows, both analytically and through simulation, that the reduction in preprocessing complexity provided by the BSD has the side effect of increasing the complexity of its tree search stage relative to that of the SD, independently of the signal-to-noise ratio (SNR). In addition, this paper shows that sorting the columns of the frequency-selective channel matrix in the SD does not reduce the complexity of the tree search stage, contrary to what occurs in frequency-flat channels.
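
To make the two stages concrete, the following is a minimal sketch of a generic depth-first sphere decoder, not the BSD studied in the paper: the preprocessing stage is a QR decomposition of the channel matrix, and the tree search stage is a depth-first enumeration of candidate symbol vectors with pruning against the best metric found so far. The function `sphere_decode`, its arguments, and the toy 2x2 example are hypothetical and serve only to clarify the terminology used in the abstract.

```python
import numpy as np

def sphere_decode(H, y, constellation):
    """Hypothetical minimal sphere decoder (illustration only).

    Preprocessing stage: QR decomposition of the channel matrix H.
    Tree search stage: depth-first search over candidate symbol vectors,
    pruning any branch whose partial distance already exceeds the best
    full distance found so far.
    """
    n = H.shape[1]
    Q, R = np.linalg.qr(H)           # preprocessing: H = Q R
    z = Q.conj().T @ y               # rotated receive vector

    best_dist, best_x = np.inf, None
    # Each stack entry: (tree level, symbols fixed so far, partial distance).
    # Levels run from n-1 down to 0, matching the upper-triangular R.
    stack = [(n - 1, [], 0.0)]
    while stack:
        level, fixed, dist = stack.pop()
        for s in constellation:
            candidate = [s] + fixed                      # symbols for indices level..n-1
            r = z[level] - R[level, level:] @ np.array(candidate)
            d = dist + abs(r) ** 2
            if d >= best_dist:                           # prune: outside current sphere
                continue
            if level == 0:                               # leaf: complete symbol vector
                best_dist, best_x = d, np.array(candidate)
            else:
                stack.append((level - 1, candidate, d))
    return best_x, best_dist


# Hypothetical usage: 2x2 real-valued MIMO channel with a BPSK constellation.
H = np.array([[1.0, 0.4], [0.3, 0.9]])
x = np.array([1.0, -1.0])
y = H @ x + 0.05 * np.random.randn(2)
x_hat, d = sphere_decode(H, y, constellation=(-1.0, 1.0))
print(x_hat, d)
```

In this sketch the cost of the tree search depends on how effectively branches are pruned, which in turn depends on the preprocessing applied to the channel matrix; that coupling between the two stages is the trade-off the abstract refers to.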
