Abstract
This paper studies the downlink performance of a cellular-based internet-of-things (IoT) network in which the receiving devices are powered solely by energy harvested from ambient radio frequency (RF) signals. Assuming that the cellular network is the only source of RF energy, we consider a time-division based approach for power and information transfer where each time slot is partitioned into two sub-slots: (i) a charging sub-slot during which the base stations (BSs) act as chargers for the devices, and (ii) an information sub-slot in which the devices receive information using the energy harvested during the charging sub-slot. Modeling the BS locations as a Poisson point process (PPP), we study a new composite outage probability metric that captures the joint effect of outages due to insufficient energy harvested during the charging sub-slot and low signal quality in the information sub-slot. Using this metric, we concretely demonstrate the existence of an optimal division of the downlink time slot between charging and information transmission that maximizes the average downlink throughput for a given user.
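The trade-off described above can be illustrated with a small Monte Carlo sketch. All parameter values, the energy-sufficiency condition, and the interference-limited SIR model below are illustrative assumptions, not the paper's analytical setup: BSs form a PPP on a disk, the device at the origin harvests from all BSs during a charging fraction tau of the slot, then decodes from the nearest BS in the remaining fraction, and the composite outage counts a failure of either stage.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical parameters (illustrative, not from the paper) ---
lam = 1e-5        # BS density per m^2 (PPP intensity)
radius = 2000.0   # simulation disk radius (m)
P_tx = 1.0        # BS transmit power (W)
alpha = 4.0       # path-loss exponent
eta = 0.5         # RF energy-harvesting efficiency
rho = 1e-7        # assumed device power draw in the information sub-slot (W)
theta = 1.0       # SIR threshold for successful decoding
n_trials = 2000

def composite_outage(tau):
    """Monte Carlo estimate of the composite outage probability for a
    charging fraction tau: the device fails if either the harvested
    energy is insufficient or the downlink SIR falls below theta."""
    fails = 0
    for _ in range(n_trials):
        n_bs = rng.poisson(lam * np.pi * radius**2)
        if n_bs == 0:
            fails += 1
            continue
        # Uniform points in a disk realize the PPP; the user sits at the origin.
        r = radius * np.sqrt(rng.random(n_bs))
        h = rng.exponential(1.0, n_bs)  # Rayleigh fading power gains
        rx = P_tx * h * np.maximum(r, 1.0) ** (-alpha)
        # Charging sub-slot: the device harvests from every BS.
        harvested = eta * tau * rx.sum()
        energy_ok = harvested >= rho * (1.0 - tau)
        # Information sub-slot: nearest BS serves; all others interfere.
        i = int(np.argmin(r))
        sir = rx[i] / (rx.sum() - rx[i] + 1e-30)
        if not (energy_ok and sir >= theta):
            fails += 1
    return fails / n_trials

# Sweep the slot division: a larger tau harvests more energy but leaves
# less time for information transfer, so throughput peaks in between.
taus = np.linspace(0.05, 0.95, 10)
throughput = [(1 - t) * (1 - composite_outage(t)) * np.log2(1 + theta)
              for t in taus]
best = taus[int(np.argmax(throughput))]
print(f"throughput-maximizing charging fraction ~ {best:.2f}")
```

With these toy numbers the fading is drawn once per slot (shared by both sub-slots) for simplicity; the qualitative shape, an interior throughput-maximizing slot division, is what the abstract's result refers to.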