Recent literature has demonstrated promising results for Long-Term Evolution (LTE) deployments over unlicensed bands coexisting with Wi-Fi networks via the Duty-Cycle (DC) approach. However, coexistence performance is known to depend strongly on traffic patterns and on the LTE duty-cycle ON–OFF rate. Most DC solutions rely on a static configuration of coexistence parameters, so real-life performance may degrade in dynamically varying scenarios. Advanced reinforcement learning techniques can be used to adjust DC parameters toward efficient coexistence, and we propose a Q-learning Carrier-Sensing Adaptive Transmission mechanism that adapts the LTE duty-cycle ON–OFF time ratio to the transmitted data rate, aiming to maximize the aggregated Wi-Fi and LTE-Unlicensed (LTE-U) throughput. The problem is formulated as a Markov decision process, and the Q-learning solution for finding the best LTE-U ON–OFF time ratio is based on the Bellman equation. We evaluate the performance of the proposed solution for different traffic load scenarios using the ns-3 simulator. Results demonstrate the benefits of the proposed method's adaptability to changing conditions in terms of aggregated Wi-Fi/LTE throughput, as well as its achievement of fair coexistence.
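For readers unfamiliar with the update rule referenced above, the sketch below illustrates a generic Q-learning loop of the kind the abstract describes. It is a minimal illustration, not the paper's implementation: the state discretization, the candidate set of ON-time ratios, and the throughput-based reward are all assumptions made for the example.

```python
import random
from collections import defaultdict

# Hyperparameters (assumed values for illustration, not the paper's settings).
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount factor, exploration rate
ACTIONS = [0.2, 0.4, 0.6, 0.8]          # hypothetical candidate LTE-U ON-time ratios
Q = defaultdict(float)                  # Q[(state, action)] -> estimated value

def choose_action(state):
    """Epsilon-greedy selection of a duty-cycle ON-OFF time ratio."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """Standard Q-learning update derived from the Bellman equation:
    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
```

In the setting the abstract describes, the reward would correspond to the observed aggregated Wi-Fi/LTE-U throughput, and the chosen action would set the LTE-U duty-cycle ON–OFF time ratio for the next decision interval.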