Abstract

Recently, multiple-input–multiple-output (MIMO) orthogonal frequency-division multiplexing (OFDM) systems have gained significant attention in wireless communications, and Riemannian manifold methods have become prevalent in MIMO-OFDM signal processing. However, existing data detection algorithms for MIMO-OFDM systems are mostly designed for block-fading channels and often suffer from high computational complexity. In this paper, we propose a data detection algorithm based on Riemannian manifold optimization for MIMO-OFDM systems under time-varying channels. The core idea is to recover the transmitted signals by solving a manifold optimization problem formulated for the time-varying channel. To reduce computational complexity, we further improve the algorithm by dividing the transmitted signals into multiple subframes, solving the optimization problem for each subframe separately, and using pilots to maintain detection performance. In the simulations, the proposed algorithms and the zero-forcing detection algorithm are compared under different parameter settings. The results show that the proposed algorithm achieves good bit error rate performance with low computational complexity.
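To illustrate the general idea of manifold-based detection (not the authors' exact method, whose details are not given in the abstract), the following minimal Python sketch runs Riemannian gradient descent on the complex-circle manifold {x : |x_k| = 1}, which models constant-modulus (e.g., PSK) symbol detection min_x ||y - Hx||^2 for a single subcarrier. The channel model, step size, and iteration count are illustrative assumptions.

import numpy as np

def riemannian_detect(H, y, n_iter=100, step=0.05):
    """Detect unit-modulus symbols x from y = H x + noise via
    Riemannian gradient descent on the complex-circle manifold."""
    # Warm start from the (entry-wise normalized) zero-forcing estimate.
    x = np.linalg.pinv(H) @ y
    x = x / np.abs(x)

    for _ in range(n_iter):
        # Euclidean (Wirtinger) gradient of ||y - H x||^2 w.r.t. conj(x).
        egrad = -H.conj().T @ (y - H @ x)
        # Project onto the tangent space of the complex circle at x.
        rgrad = egrad - np.real(egrad * np.conj(x)) * x
        # Retraction: take a descent step, then renormalize to |x_k| = 1.
        x = x - step * rgrad
        x = x / np.abs(x)
    return x

# Toy usage: 4x4 MIMO with QPSK symbols on one subcarrier (hypothetical setup).
rng = np.random.default_rng(0)
n_tx = n_rx = 4
qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, n_tx)))
H = (rng.standard_normal((n_rx, n_tx)) + 1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2)
y = H @ qpsk + 0.05 * (rng.standard_normal(n_rx) + 1j * rng.standard_normal(n_rx))
x_hat = riemannian_detect(H, y)
print("recovered:", np.round(x_hat, 3))
print("true:     ", np.round(qpsk, 3))

In the paper's setting, this per-subcarrier idea would be applied jointly over the time-varying channel, with the transmitted block split into subframes and pilots anchoring each subframe's solution.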
