Abstract

The primary purpose of this article is to demonstrate experimentally, and to explain theoretically on the basis of physics-based literature and empirical industry-standard models, that in the limit where the receiver height above an imperfectly conducting plane such as the surface of the Earth vanishes, the propagation of 4G (800 MHz) signals in a typical cellular wireless setup reduces to the simple two-ray plane-Earth model, with approximately 40 dB/decade of signal loss over almost the entire cell. This contrasts with the more common case of finite-height receivers, whose path loss scales with transmitter-receiver distance in more complex ways. The reason is simple: the "breakpoint" distance, beyond which all models predict this power law, vanishes as the height of the receive antenna vanishes, yielding a unified picture of signal propagation for ground-level receivers in the microwave regime.
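
To make the argument concrete, the standard two-ray plane-Earth relations (textbook results, not reproduced from this article) can be stated as follows. In the far field, beyond the breakpoint, the received power is

\[ P_r \approx P_t \, G_t \, G_r \, \frac{h_t^2 \, h_r^2}{d^4}, \]

so the path loss grows as \( 40 \log_{10} d \), i.e. 40 dB/decade. The breakpoint itself sits near the first-Fresnel-zone distance

\[ d_b = \frac{4 \, h_t \, h_r}{\lambda}. \]

As an illustration with assumed (not article-specific) values: at 800 MHz, \( \lambda \approx 0.375 \) m, so \( h_t = 30 \) m and \( h_r = 1.5 \) m give \( d_b \approx 480 \) m, while \( d_b \to 0 \) as \( h_r \to 0 \). A ground-level receiver therefore sees the 40 dB/decade regime essentially everywhere in the cell.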
