Abstract

The growing demand for high-speed data transmission has driven rapid advances in wireless communication. Unfortunately, the impairments a wireless channel introduces while carrying data symbols can degrade network performance. Channel equalization is one proposed remedy, and it can be implemented with machine learning techniques. In this paper, a hybrid approach is proposed that combines the tracking mode and training mode of an adaptive equalizer. The method uses machine learning (ML) to classify environments (highly cluttered, medium cluttered, low cluttered, and open space) based on measurements of their RF signals. The results of the study reveal that the proposed method can perform well in real-time deployments. The performance of six ML algorithms, namely the Logistic Regression, K-Nearest Neighbors (KNN), Support Vector Machine (SVM), Naive Bayes, Decision Tree, and Random Forest classifiers, is analyzed for different numbers of samples (10, 50, and 100) and evaluated by comparing their accuracy, sensitivity, specificity, F1 score, and confusion matrix. The objective of this study is to demonstrate that no single ML algorithm performs well in all kinds of environments; to choose the best algorithm for a given environment, the decision device has to analyze the various factors that affect system performance. For instance, the Random Forest classifier performed well in terms of accuracy (100 percent), specificity (100 percent), sensitivity (100 percent), and F1 score (100 percent), whereas the Logistic Regression algorithm did not perform well in the low-cluttered environment.
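The abstract describes training the six classifiers on RF signal measurements and comparing their accuracy, sensitivity, specificity, F1 score, and confusion matrix. As a minimal sketch of that comparison, assuming a feature matrix X of RF measurements and environment labels y are available (the synthetic data below is a hypothetical placeholder, not the paper's dataset), a scikit-learn loop of this form reproduces the evaluation pipeline:

```python
# Minimal sketch of the classifier comparison described in the abstract.
# The feature matrix X and labels y here are hypothetical placeholders;
# the paper's actual RF measurement pipeline is not shown.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score, confusion_matrix

# Hypothetical data: rows are RF measurements, columns are signal features;
# labels 0..3 stand for the four environment classes named in the abstract
# (open space, low, medium, and highly cluttered).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))        # e.g., 100 samples, 4 RF features
y = rng.integers(0, 4, size=100)     # environment class labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

classifiers = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
}

for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    y_pred = clf.predict(X_test)
    print(name,
          "accuracy:", accuracy_score(y_test, y_pred),
          "F1 (macro):", f1_score(y_test, y_pred, average="macro"))
    print(confusion_matrix(y_test, y_pred))
```

Per-class sensitivity and specificity can be derived from the confusion matrix, and rerunning the loop with 10, 50, and 100 samples would reproduce the sample-size comparison the abstract mentions.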
