This study describes how machine learning methods can be used to enhance the performance of LiFi (light fidelity) systems based on DCO-OFDM (direct-current-biased optical orthogonal frequency division multiplexing). LiFi, a rapidly developing technology that uses light for wireless communication, has to contend with issues such as channel impairments and signal interference. This work investigates several machine learning strategies to address these problems and improve the overall performance of DCO-OFDM LiFi systems. By analyzing real-world data and simulation studies, it evaluates how well various machine learning techniques improve system parameters, decrease latency, and raise throughput.

LiFi employs several forms of optical orthogonal frequency division multiplexing (OFDM), among them DC-biased optical OFDM (DCO-OFDM). In DCO-OFDM, a large DC bias wastes optical power, whereas a bias that is too small increases clipping noise. It is therefore essential to determine an appropriate DC bias level for DCO-OFDM. This study finds the optimal DC bias value for DCO-OFDM based LiFi systems using machine learning (ML) algorithms. To do this, MATLAB is used to generate a DCO-OFDM dataset, and the machine learning methods are then applied in Python. ML is used to identify the key DCO-OFDM characteristics that influence the optimal DC bias. It is shown that the minimum, maximum, and standard deviation of the bipolar OFDM signal, together with the constellation size, determine the optimal DC bias.
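To make the bias/clipping trade-off concrete, the following is a minimal Python sketch (not the authors' MATLAB pipeline) of DCO-OFDM frame generation: Hermitian-symmetric subcarrier mapping yields a real bipolar signal, and a DC bias proportional to the signal's standard deviation is added before clipping at zero. The FFT size, modulation order, and the bias rule bias = mu * sigma are illustrative assumptions.

```python
# Illustrative DCO-OFDM frame generation and DC biasing (assumed parameters).
import numpy as np

def dco_ofdm_frame(n_fft=64, mod_order=16, bias_factor=2.0, rng=None):
    """Generate one DCO-OFDM time-domain frame and apply a DC bias."""
    rng = np.random.default_rng(rng)
    n_data = n_fft // 2 - 1                      # usable subcarriers under Hermitian symmetry

    # Random square-QAM symbols (Gray mapping omitted for brevity).
    m_side = int(np.sqrt(mod_order))
    levels = 2 * np.arange(m_side) - (m_side - 1)
    sym = rng.choice(levels, n_data) + 1j * rng.choice(levels, n_data)
    sym = sym / np.sqrt(np.mean(np.abs(sym) ** 2))   # unit average symbol power

    # Hermitian-symmetric mapping -> real-valued IFFT output (bipolar signal).
    X = np.zeros(n_fft, dtype=complex)
    X[1:n_fft // 2] = sym
    X[n_fft // 2 + 1:] = np.conj(sym[::-1])
    x_bipolar = np.fft.ifft(X).real * np.sqrt(n_fft)

    # DC bias proportional to the signal standard deviation (common convention).
    sigma = np.std(x_bipolar)
    bias = bias_factor * sigma
    x_dco = np.clip(x_bipolar + bias, 0.0, None)      # clip residual negative samples

    clipped_fraction = np.mean(x_bipolar + bias < 0)  # share of samples lost to clipping
    return x_bipolar, x_dco, bias, clipped_fraction

if __name__ == "__main__":
    # A larger bias factor reduces clipping noise but raises the optical power cost.
    for mu in (1.0, 2.0, 3.0):
        _, _, bias, clipped = dco_ofdm_frame(bias_factor=mu, rng=0)
        print(f"bias factor {mu}: DC bias {bias:.3f}, clipped samples {clipped:.1%}")
```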
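The ML step can be sketched as a regression from the features named above (minimum, maximum, and standard deviation of the bipolar OFDM signal, plus constellation size) to the optimal DC bias. The snippet below uses scikit-learn with synthetic stand-in data and assumed column names; in the actual study the labels would come from the MATLAB-generated dataset, and the choice of regressor is illustrative, not the authors' specific model.

```python
# Hedged sketch: predict the optimal DC bias from signal statistics (synthetic data).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n_samples = 2000

# Stand-in feature table mimicking the dataset layout (assumed column names).
df = pd.DataFrame({
    "sig_min": -rng.uniform(2.0, 5.0, n_samples),
    "sig_max": rng.uniform(2.0, 5.0, n_samples),
    "sig_std": rng.uniform(0.8, 1.5, n_samples),
    "constellation_size": rng.choice([4, 16, 64, 256], n_samples),
})
# Placeholder target: assumes the optimal bias grows with signal spread and modulation order.
df["optimal_dc_bias"] = (2.0 * df["sig_std"]
                         + 0.05 * np.log2(df["constellation_size"])
                         + rng.normal(0, 0.05, n_samples))

X = df[["sig_min", "sig_max", "sig_std", "constellation_size"]]
y = df["optimal_dc_bias"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

print("MAE on held-out samples:", mean_absolute_error(y_test, pred))
print("feature importances:", dict(zip(X.columns, model.feature_importances_.round(3))))
```

The feature-importance output illustrates how such a model can indicate which DCO-OFDM characteristics most strongly influence the predicted optimal DC bias.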