Abstract

Scheduling procedures implemented in wireless networks consist of varied workflows such as resource allocation, channel gain improvement, and reduction of packet arrival delay. Among these techniques, Long Term Evolution (LTE) scheduling is preferred because of its high-speed communication and low bandwidth consumption. LTE allocates resources to each workflow in both the time and frequency domains. Normally, the information gathered prior to scheduling increases processing time, since every user attribute has to be verified. To address this issue, recent research studies have analyzed parallel processing via data mining. The label assigned to the user attributes is the primary contributor to effective time-slot scheduling: label assignment reduces delay, while parallel processing via data mining increases throughput. In addition, extracting matched data from the library and predicting available channels with fewer dimensions remain major challenges in LTE scheduling. This paper surveys LTE scheduling algorithms, dimensionality reduction techniques, optimal feature selection techniques, multi-level classification techniques, and data mining techniques combined with LTE. The survey illustrates the impact of each technique on 3G/4G networks, channel availability prediction, and time-slot scheduling in detail. A tabular comparison of the techniques involved in the respective LTE processes reveals that verification of channel and user availability is the primary function of LTE scheduling. The survey identifies limitations of existing systems, such as computational complexity and poor scheduling performance, and encourages researchers to develop novel algorithms for LTE scheduling.
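
As a point of reference for the time- and frequency-domain resource allocation described above, the sketch below shows a minimal proportional-fair scheduler over LTE resource blocks. It is an illustrative sketch only: the user names, channel-rate values, and the ALPHA smoothing parameter are assumptions for demonstration, not parameters taken from any of the surveyed algorithms.

    # Minimal sketch of proportional-fair scheduling over LTE resource blocks.
    # User set, instantaneous rates, and ALPHA are illustrative assumptions;
    # the surveyed algorithms differ in how they weight these terms.
    import random

    NUM_RBS = 6      # resource blocks per subframe (frequency domain)
    NUM_TTIS = 10    # transmission time intervals simulated (time domain)
    USERS = ["ue1", "ue2", "ue3"]
    ALPHA = 0.1      # smoothing factor for the average-throughput estimate

    avg_rate = {ue: 1e-6 for ue in USERS}   # small seed avoids division by zero

    for tti in range(NUM_TTIS):
        # Hypothetical per-user instantaneous rates, e.g. derived from CQI reports.
        inst_rate = {ue: random.uniform(0.5, 5.0) for ue in USERS}

        for rb in range(NUM_RBS):
            # Proportional-fair metric: instantaneous rate over long-term average.
            chosen = max(USERS, key=lambda ue: inst_rate[ue] / avg_rate[ue])
            served = inst_rate[chosen] / NUM_RBS   # share of the rate on this RB

            # Update the exponential moving average for every user.
            for ue in USERS:
                gain = served if ue == chosen else 0.0
                avg_rate[ue] = (1 - ALPHA) * avg_rate[ue] + ALPHA * gain

        print(f"TTI {tti}: average rates",
              {u: round(r, 3) for u, r in avg_rate.items()})

The metric ratio (instantaneous rate divided by the running average) is what balances throughput against fairness; schedulers surveyed in the paper replace or extend this metric, for example with delay-aware or classification-driven terms.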
