Abstract

This work presents a theoretical framework for determining the optimal clipping level for a multiple-input multiple-output (MIMO) orthogonal frequency division multiplexing (OFDM) system. We start by modeling the clipping noise and propose a maximum likelihood (ML) receiver for the resulting signal. The bit error rate (BER) for this ML receiver is then derived for MIMO spatial multiplexing, space-time coding, and cyclic delay diversity schemes. A search through the tradeoff space between the BER performance penalty and the power amplifier (PA) power consumption advantage reveals the optimum clipping levels of the system. Using an IEEE 802.11n-like system, we show that the optimum clipping predicted by this approach can improve PA power efficiency by as much as 70%.
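To make the clipping operation concrete, the sketch below soft-limits the envelope of a time-domain OFDM symbol at a threshold set by a clipping ratio (threshold divided by the RMS amplitude), and separates the resulting clipping-noise term in the usual signal-plus-distortion decomposition. The subcarrier count, modulation, and the clipping ratio value are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative OFDM symbol: 64 QPSK subcarriers -> time domain via IFFT.
# (The paper considers an 802.11n-like MIMO system; this is a single-antenna sketch.)
N = 64
X = (rng.choice([-1.0, 1.0], N) + 1j * rng.choice([-1.0, 1.0], N)) / np.sqrt(2)
x = np.fft.ifft(X) * np.sqrt(N)  # scaled so average power is ~1

def soft_clip(x, clipping_ratio):
    """Limit the envelope at A = clipping_ratio * rms(x), preserving phase."""
    A = clipping_ratio * np.sqrt(np.mean(np.abs(x) ** 2))
    mag = np.abs(x)
    scale = np.minimum(1.0, A / np.maximum(mag, 1e-12))
    return x * scale

cr = 1.4  # example clipping ratio (an assumption, not the paper's optimum)
y = soft_clip(x, cr)
d = y - x  # clipping-noise term: clipped signal = original + distortion

papr_db = 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))
print("PAPR before clipping (dB):", papr_db)
print("clipping-noise power:", np.mean(np.abs(d) ** 2))
```

Lowering the clipping ratio reduces the peak-to-average power ratio the PA must accommodate (improving its efficiency) while raising the clipping-noise power, which is exactly the BER-versus-power-consumption tradeoff the paper searches over.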
