Abstract

Focusing on cell-free massive multiple-input multiple-output (MIMO) networks with low-resolution analog-to-digital converters (ADCs), we derive the theoretical achievable uplink spectral efficiency (SE) for minimum mean-squared error (MMSE)-based large-scale fading decoding (LSFD). Moreover, we investigate bit allocation (BA) among access points (APs) to maximize the asymptotic sum SE under a constraint on the total number of ADC quantization bits. Simulation results confirm that our theoretical analysis is accurate and show that the proposed BA technique outperforms random and fixed allocation.
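The paper's BA algorithm is not given in the abstract, so the following is only an illustrative sketch of the general idea: distributing a total bit budget across APs to maximize a sum-SE objective. It uses a hypothetical additive-quantization-noise model (effective gain `1 - c*2**(-2*b)` for `b` bits, with `c` an assumed distortion constant) and a greedy allocation that gives each next bit to the AP with the largest marginal SE gain; the per-AP channel gains and all parameter names are placeholders, not values from the paper.

```python
import math

def quant_gain(bits, c=1.5):
    # Assumed additive-quantization-noise model: a b-bit ADC scales the
    # effective channel gain by roughly 1 - c * 2^(-2b) (placeholder constant c).
    return max(0.0, 1.0 - c * 2.0 ** (-2 * bits))

def sum_se(gains, bits):
    # Toy sum-SE objective: sum over APs of log2(1 + gain * quantization factor).
    return sum(math.log2(1.0 + g * quant_gain(b)) for g, b in zip(gains, bits))

def greedy_bit_allocation(gains, total_bits, min_bits=1, max_bits=12):
    """Greedily allocate a total ADC bit budget across APs.

    Starts every AP at min_bits, then repeatedly assigns one more bit to
    the AP whose SE increases the most. Because the per-AP SE model above
    has diminishing returns in b, this greedy rule is a natural heuristic
    for the separable budgeted maximization.
    """
    m_aps = len(gains)
    bits = [min_bits] * m_aps
    for _ in range(total_bits - min_bits * m_aps):
        best_ap, best_delta = None, -1.0
        for m in range(m_aps):
            if bits[m] >= max_bits:
                continue
            delta = (math.log2(1.0 + gains[m] * quant_gain(bits[m] + 1))
                     - math.log2(1.0 + gains[m] * quant_gain(bits[m])))
            if delta > best_delta:
                best_ap, best_delta = m, delta
        if best_ap is None:
            break  # every AP already at max_bits
        bits[best_ap] += 1
    return bits

# Example usage: 9 total bits shared by 3 APs with unequal channel gains.
gains = [10.0, 5.0, 1.0]
alloc = greedy_bit_allocation(gains, total_bits=9)
```

Under this concave per-AP model the greedy allocation never does worse than splitting the budget equally, which mirrors the abstract's comparison against fixed allocation.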
