The demand for data transmission is rising exponentially, driven by applications such as biomedical sensor data, multimedia technologies, and ultra-high-definition online video streaming. Such applications require large bandwidth with minimal latency and seamless service delivery. Radio-over-fiber (RoF), integrated with wavelength division multiplexing (WDM) technology, is considered one of the most promising technologies to meet these demands. However, the integration of optical fiber and wireless communication also generates non-linear effects as the number of users increases, which introduce signal noise, unwanted frequencies, degraded signal quality, and increased latency. In this paper, a 16-channel, 160 Gbps WDM-based RoF system is simulated and evaluated for optimum performance at input power levels varying from 5 to −15 dBm, with the application of dispersion compensation fiber (DCF) and fiber Bragg grating (FBG), at channel spacings of 50 and 100 GHz. The performance of the system is compared with that of an existing WDM-RoF system. The performance metrics chosen for evaluation are bit error rate (BER), quality factor (Q-factor), and eye diagrams, and the system is simulated in the OptiSystem simulator. Optimum performance is observed at a power level of −5 dBm for all of the selected evaluation parameters. It is also observed that the network performs better at a channel spacing of 100 GHz than at 50 GHz.
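For readers unfamiliar with the relationship between the two headline metrics, BER and Q-factor are linked by the standard Gaussian-noise approximation BER ≈ ½·erfc(Q/√2). The short Python sketch below illustrates this textbook relation as a reference calculation; it is not the paper's OptiSystem simulation, and the sample Q values are illustrative only.

```python
import math

def ber_from_q(q: float) -> float:
    """Approximate BER for a given Q-factor under the standard
    Gaussian-noise assumption: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2))

# Illustrative Q values; Q ~= 6 corresponds to the commonly cited
# BER ~= 1e-9 threshold for an acceptable optical link.
for q in (4.0, 5.0, 6.0, 7.0):
    print(f"Q = {q}: BER ~ {ber_from_q(q):.2e}")
```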