Abstract

Delivering Gbps rates to users at 100-meter range is challenging, and large phased arrays combined with wide bandwidth in the mmWave range are considered a promising solution for 5G and beyond. We evaluate the minimum per-element power required for phased arrays to support a 1 Gbps rate at mmWave bands by quantifying the impact of the channel estimation penalty, path loss, array size, effective beamforming gain saturation due to angular spread, and channel coherence. With 1 GHz of bandwidth, a minimum per-element power of 18 dBm is required to deliver a 1 Gbps downlink rate from an urban macro base station to outdoor users 100 m away in non-line-of-sight (NLOS), using a transmit array of 4x4, 7x7, or 10x10 elements at the 28, 90, or 144 GHz band, respectively, and assuming 5 dBi element gain. Using more bandwidth would lower the per-element power requirement by less than 2 dB in most cases. Larger antenna arrays can also lower the per-element power requirement, as long as the associated beam-searching and channel estimation overhead does not outweigh the gain in spectral efficiency. For antenna systems with a strict equivalent isotropic radiated power (EIRP) limit, such as indoor access points or modems, the benefit of using larger arrays at both ends at higher bands is curbed by the effective gain reduction caused by angular spread, increased beam acquisition overhead, worse-than-free-space path loss, and body blockage. It is therefore challenging for indoor access points to provide high-speed transmission at high bands to NLOS wearable devices such as augmented reality goggles.
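To make the per-element power reasoning concrete, the following is a minimal Shannon link-budget sketch in Python. The path loss, noise figure, implementation loss, and receive gain values here are illustrative assumptions rather than the paper's calibrated models (which additionally account for angular-spread gain saturation, beam-search overhead, and channel coherence), so the output only roughly approximates the 18 dBm figure quoted above.

```python
import math

def required_per_element_power_dbm(
    rate_bps=1e9,        # target rate: 1 Gbps
    bandwidth_hz=1e9,    # 1 GHz bandwidth, as in the abstract
    n_elements=49,       # e.g. a 7x7 transmit array (90 GHz case)
    elem_gain_dbi=5.0,   # 5 dBi element gain, as in the abstract
    path_loss_db=130.0,  # assumed illustrative NLOS loss at 100 m
    rx_gain_dbi=0.0,     # assumed isotropic receiver
    noise_figure_db=7.0, # assumed receiver noise figure
    impl_loss_db=3.0,    # assumed implementation/estimation penalty
):
    """Idealized link budget; ignores the angular-spread gain
    saturation and beam-acquisition overhead the paper models."""
    # SNR needed to hit the target spectral efficiency (Shannon bound)
    snr_linear = 2 ** (rate_bps / bandwidth_hz) - 1
    snr_db = 10 * math.log10(snr_linear)
    # Thermal noise floor in dBm: -174 dBm/Hz + 10*log10(B) + NF
    noise_dbm = -174 + 10 * math.log10(bandwidth_hz) + noise_figure_db
    # Required received power, then required EIRP over the path loss
    rx_power_dbm = noise_dbm + snr_db + impl_loss_db
    eirp_dbm = rx_power_dbm + path_loss_db - rx_gain_dbi
    # Coherent array: EIRP = P_elem + 20*log10(N) + element gain
    return eirp_dbm - 20 * math.log10(n_elements) - elem_gain_dbi

print(f"{required_per_element_power_dbm():.1f} dBm per element")
```

Under these assumed numbers the sketch yields roughly 17 dBm per element; note that the 20*log10(N) term is why doubling the element count lowers the per-element requirement by about 6 dB, until beam-search overhead and angular spread erode that benefit.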
