Abstract

Light fidelity (LiFi) systems use various forms of orthogonal frequency division multiplexing (OFDM), including DC-biased optical OFDM (DCO-OFDM). In DCO-OFDM, a large DC bias causes optical power inefficiency, while a small bias leads to higher clipping noise; finding an appropriate DC bias level is therefore important. This paper applies machine learning (ML) algorithms to find the optimum DC bias value for DCO-OFDM-based LiFi systems. A dataset is first generated for DCO-OFDM in MATLAB, and ML algorithms are then applied in Python. ML is used to identify the attributes of DCO-OFDM that influence the optimum DC bias. It is shown that the optimum DC bias is a function of several factors, including the minimum, the standard deviation, and the maximum of the bipolar OFDM signal, and the constellation size. Linear and polynomial regression algorithms are then successfully applied to predict the optimum DC bias value. Results show that polynomial regression of order 2 can predict the optimum DC bias with a coefficient of determination of 96.77%, confirming the effectiveness of the prediction.
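The regression step described above can be sketched as follows. This is an illustrative reconstruction, not the paper's actual pipeline: the dataset here is synthetic, and the target function relating the named features (minimum, standard deviation, and maximum of the bipolar OFDM signal, and constellation size) to the optimum bias is a hypothetical stand-in.

```python
# Sketch of degree-2 polynomial regression over the feature set named in
# the abstract, on a synthetic dataset (all numbers are illustrative).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
n = 500
sig_min = -rng.uniform(2.0, 4.0, n)        # minimum of the bipolar OFDM signal
sig_std = rng.uniform(0.8, 1.2, n)         # its standard deviation
sig_max = rng.uniform(2.0, 4.0, n)         # its maximum
m_order = rng.choice([4, 16, 64, 256], n)  # constellation size M

X = np.column_stack([sig_min, sig_std, sig_max, np.log2(m_order)])
# Hypothetical target: optimum bias grows with signal spread and order.
y = 0.6 * sig_std * np.sqrt(np.log2(m_order)) - 0.3 * sig_min + 0.05 * sig_max

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)
print(f"R^2 on training data: {r2_score(y, model.predict(X)):.4f}")
```

PolynomialFeatures expands the four raw attributes into all degree-2 terms (squares and pairwise products) before the linear fit, which is what "polynomial regression of order 2" amounts to in scikit-learn.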

Highlights

  • Data Availability Statement: All relevant data are within the manuscript and its Supporting information files

  • Accepted: November 1, 2021. Published: November 23, 2021

  • We identify the important features from the DCO-orthogonal frequency division multiplexing (OFDM) signal and train the model using several machine learning (ML) algorithms to find the optimum value of DC bias resulting in a low bit error rate (BER)


Summary

Introduction

While ML is being considered in many aspects of traditional RF wireless communication, there is little work on ML in the context of optical wireless communication (OWC) or LiFi. For finding the DC bias of DCO-OFDM-based LiFi, ML can be used to identify patterns, distributions, and trends in the data samples of a DCO-OFDM system. We identify the important features from the DCO-OFDM signal and train models using several ML algorithms to find the optimum value of DC bias resulting in a low bit error rate (BER). A discussion is provided of the existing work in the field of DCO-OFDM-based LiFi, and the paper then analyzes how to find the optimum DC bias using ML algorithms. In DCO-OFDM, a DC bias is applied to the bipolar signal and the remaining negative peaks are clipped at zero. To recover the transmitted data, the receiver performs operations that are the inverse of those in the transmitter blocks.

Related work
Findings
Conclusion
