Abstract

It has been shown that for optical communication receivers with large, signal-dependent noise components (multiplicative noise), the optimum detection threshold can be derived from a Bayes likelihood ratio test (LRT); however, the mean and variance of each bit level must be known to obtain the order-of-magnitude bit-error-rate (BER) improvement over the typical matched-filter detector, which assumes equal variances for the two bit levels. In free-space communication systems, atmospheric conditions cause variations in optical transmission and, consequently, in the bit-level means and variances. These means and variances must be tracked and estimated, and the detection threshold updated, at a rate greater than the frequency of atmospheric changes; otherwise the BER performance may actually be worse than that of the equal-variance threshold. Adaptive thresholding methods have been proposed and developed that track the bit-level means and variances and update the detection threshold to maintain near-optimum performance. In this paper, simulated data based on actual optical receiver component characteristics, together with measured average received-power data containing atmospheric-turbulence-induced fluctuations, are used to test the tracking and BER performance of adaptive thresholding algorithms. The results of simulations comparing the performance of three adaptive methods, maximum-likelihood estimation/prediction, a Kalman filter predictor/smoother, and a least-mean-square (LMS) adaptive predictor, are presented.
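
As a minimal illustration (not taken from the paper), assume each bit level is Gaussian: level 0 with mean mu0 and standard deviation sig0, level 1 with mean mu1 and sig1. Equating the two log-likelihoods gives a quadratic in the decision threshold; the Python sketch below solves it and contrasts the result with the equal-variance midpoint threshold. All parameter values are hypothetical, chosen so that the "1" level carries a much larger (multiplicative) noise variance.

import numpy as np

def lrt_threshold(mu0, sig0, mu1, sig1):
    """Threshold where the two Gaussian likelihoods p(x|0) and p(x|1) are equal.

    Equating log-likelihoods gives a*x^2 + b*x + c = 0 with the
    coefficients below; the root lying between the bit-level means
    is the usable decision threshold.
    """
    if np.isclose(sig0, sig1):
        return 0.5 * (mu0 + mu1)  # equal variances: midpoint threshold
    a = 1.0 / sig0**2 - 1.0 / sig1**2
    b = 2.0 * (mu1 / sig1**2 - mu0 / sig0**2)
    c = mu0**2 / sig0**2 - mu1**2 / sig1**2 - 2.0 * np.log(sig1 / sig0)
    roots = np.roots([a, b, c])
    lo, hi = min(mu0, mu1), max(mu0, mu1)
    return next(r.real for r in roots if lo <= r.real <= hi)

# Hypothetical bit levels with signal-dependent noise on the "1" level.
mu0, sig0 = 0.1, 0.02
mu1, sig1 = 1.0, 0.20
print("LRT threshold:     ", lrt_threshold(mu0, sig0, mu1, sig1))  # ~0.19
print("Midpoint threshold:", 0.5 * (mu0 + mu1))                    # 0.55

With these (assumed) numbers the LRT threshold sits much closer to the low level than the midpoint, which is why the equal-variance detector gives up substantial BER performance when multiplicative noise is large; the adaptive methods compared in the paper exist to keep the estimated means and variances feeding this threshold current as the atmosphere changes.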
