Abstract

The inherent impairments of visible light communication (VLC), namely the nonlinearity of the light-emitting diode (LED) and the optical multipath, restrict bit error rate (BER) performance. In this paper, a model-driven deep learning (DL) equalization scheme is proposed to deal with these severe channel impairments. Imitating the block-by-block signal processing of an orthogonal frequency division multiplexing (OFDM) receiver, the proposed scheme employs two subnets that replace the signal demodulation module of the traditional system and learn the channel nonlinearity and the symbol de-mapping relationship from the training data. In addition, conventional solutions and algorithms are incorporated into the system architecture to accelerate convergence. After efficient training, the distorted symbols can be implicitly equalized and recovered directly as binary bits. The results demonstrate that the proposed scheme addresses the overall channel impairments efficiently and recovers the original symbols with better BER performance. Moreover, it still works robustly when the system suffers from serious distortion and interference, which demonstrates the superiority and validity of the proposed scheme in channel equalization.
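As a rough illustration of this idea (not the authors' exact architecture), the Python sketch below replaces the demodulation stage of a DCO-OFDM receiver with two small fully connected subnets: one intended to compensate the channel/LED nonlinearity and one to perform symbol de-mapping into bits. All layer widths, module names, and the bit-wise cross-entropy loss are illustrative assumptions.

```python
# Illustrative sketch only: a two-subnet "demodulation replacement" loosely
# following the description in the abstract. Layer sizes, names, and the loss
# choice are assumptions, not the paper's specification.
import math
import torch
import torch.nn as nn

N = 64                       # number of data subcarriers (assumed)
Q = 16                       # modulation level, e.g. 16-QAM (assumed)
BITS = int(math.log2(Q))     # bits per symbol

class NonlinearityEqualizer(nn.Module):
    """Subnet 1: learns to undo LED/channel nonlinearity on received symbols."""
    def __init__(self, n_sub):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * n_sub, 256), nn.ReLU(),   # real + imaginary parts stacked
            nn.Linear(256, 2 * n_sub),
        )
    def forward(self, y):
        return self.net(y)

class SymbolDemapper(nn.Module):
    """Subnet 2: maps equalized symbols to soft bit estimates."""
    def __init__(self, n_sub, bits_per_sym):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * n_sub, 256), nn.ReLU(),
            nn.Linear(256, n_sub * bits_per_sym), nn.Sigmoid(),
        )
    def forward(self, x):
        return self.net(x)

equalizer = NonlinearityEqualizer(N)
demapper = SymbolDemapper(N, BITS)

# One training step on random placeholder data (real data would come from the
# simulated IM/DD link): binary cross-entropy against the transmitted bits.
y_rx = torch.randn(32, 2 * N)                          # received frequency-domain symbols
bits_tx = torch.randint(0, 2, (32, N * BITS)).float()  # transmitted bits
bits_hat = demapper(equalizer(y_rx))
loss = nn.functional.binary_cross_entropy(bits_hat, bits_tx)
loss.backward()
```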

Highlights

  • Visible light communication (VLC) is a promising technique for indoor short-range wireless communication systems [1,2]

  • To show the feasibility of the proposed scheme in joint channel estimation and symbol detection, several simulations over the intensity modulation and direct detection (IM/DD) channel are conducted under different training conditions to investigate the convergence and bit error rate (BER) performance

  • It should be noted that the number of neurons in the input layer is determined by the number of subcarriers N, while that of the output layer depends on the modulation level Q (see the sketch after this list)
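The following hedged sketch illustrates the kind of BER simulation the highlights describe, together with the dimensioning rule (the input handles 2N real values for N subcarriers; the output covers N·log2(Q) bits). The toy IM/DD impairment model (hard clipping plus AWGN), the hard-decision de-mapper standing in for the learned one, and all parameter values are assumptions for illustration, not the paper's exact setup.

```python
# Hedged BER-evaluation sketch over a toy IM/DD impairment model.
import math
import numpy as np

N = 64                      # subcarriers -> sets the input width (2N real values)
Q = 16                      # modulation level -> N * log2(Q) bit estimates at the output
BITS = int(math.log2(Q))
rng = np.random.default_rng(0)

def qam_mod(bits):
    """Map bit groups to a square-QAM grid (illustrative, no Gray labeling)."""
    m = int(math.sqrt(Q))
    b = bits.reshape(-1, BITS)
    i = b[:, :BITS // 2].dot(1 << np.arange(BITS // 2)[::-1])
    q = b[:, BITS // 2:].dot(1 << np.arange(BITS // 2)[::-1])
    return (2 * i - (m - 1)) + 1j * (2 * q - (m - 1))

def imdd_channel(x, clip=2.5, sigma=0.3):
    """Toy IM/DD impairment: hard clipping (LED nonlinearity proxy) + AWGN."""
    y = np.clip(x.real, -clip, clip) + 1j * np.clip(x.imag, -clip, clip)
    return y + sigma * (rng.standard_normal(y.shape) + 1j * rng.standard_normal(y.shape))

def hard_demap(y):
    """Nearest-grid-point de-mapping back to bits (stand-in for the learned de-mapper)."""
    m = int(math.sqrt(Q))
    half = BITS // 2
    i = np.clip(np.round((y.real + (m - 1)) / 2), 0, m - 1).astype(int)
    q = np.clip(np.round((y.imag + (m - 1)) / 2), 0, m - 1).astype(int)
    bi = (i[:, None] >> np.arange(half)[::-1]) & 1
    bq = (q[:, None] >> np.arange(half)[::-1]) & 1
    return np.concatenate([bi, bq], axis=1).reshape(-1)

bits_tx = rng.integers(0, 2, size=N * BITS * 1000)
bits_rx = hard_demap(imdd_channel(qam_mod(bits_tx)))
print("BER ≈", np.mean(bits_tx != bits_rx))
```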



Introduction

Visible light communication (VLC) is a promising technique for indoor short-range wireless communication systems [1,2]. Machine learning (ML)-related algorithms have demonstrated the ability to solve such nonlinear issues [20,21,22,23]; for example, the support vector machine (SVM) and the K-means algorithm can be adopted to estimate various channel impairments and to accurately identify the complex mapping relationship between the input and output signals. In [36], a model-driven DL approach using an autoencoder (AE) network is proposed to mitigate the LED nonlinearity for DCO-OFDM-based VLC systems. In this paper, inspired by the approaches in [26,27,28,29,30,31], we formulate the channel impairment mitigation problem as a learning task and propose a model-driven DL scheme, abbreviated as DL-NPE, for the DCO-OFDM-based VLC system. Throughout, N(μ, σ²) denotes the Gaussian distribution with mean μ and variance σ².
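For context, the short sketch below shows the standard DCO-OFDM link steps that such a system builds on: Hermitian-symmetric subcarrier mapping so the IFFT output is real, a DC bias with zero clipping so the LED intensity stays non-negative, and additive noise drawn from N(0, σ²). The IFFT size, bias level, and noise standard deviation are assumed values; the paper's exact frame structure is not reproduced here.

```python
# Minimal DCO-OFDM transmit/receive sketch over an IM/DD link (assumed parameters).
import numpy as np

rng = np.random.default_rng(1)
N = 64                                   # IFFT size (assumed)
sigma = 0.05                             # noise std, i.e. AWGN ~ N(0, sigma^2)

# Random QPSK symbols on subcarriers 1..N/2-1; DC and Nyquist bins left empty.
data = (rng.choice([-1, 1], N // 2 - 1) + 1j * rng.choice([-1, 1], N // 2 - 1)) / np.sqrt(2)
X = np.zeros(N, dtype=complex)
X[1:N // 2] = data
X[N // 2 + 1:] = np.conj(data[::-1])     # Hermitian symmetry -> real time-domain signal

x = np.fft.ifft(X).real * np.sqrt(N)     # real OFDM waveform
x_dco = np.maximum(x + 2.0 * np.std(x), 0.0)   # DC bias, then clip negatives for the LED

# IM/DD reception: direct detection plus additive Gaussian noise N(0, sigma^2).
y = x_dco + rng.normal(0.0, sigma, size=N)
Y = np.fft.fft(y - np.mean(y)) / np.sqrt(N)    # remove bias, return to frequency domain
print(np.round(Y[1:5], 2))                     # received (distorted) data subcarriers
```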

OFDM-Based VLC System
Inherent Impairments
The Proposed Scheme
System Architecture
Training Specification
Complexity Analysis
Simulation Results
The Convergence Performance
The BER Performance
Impact of Clipping
Conclusions
