Abstract

In this paper, we investigate the problem of timing synchronization for a linearly modulated burst transmission scheme. The symbols' timing offset is assumed to vary linearly across the burst, a variation caused by sampling frequency offset. Moreover, propagation through an Additive White Gaussian Noise (AWGN) channel corrupts the received burst signal and introduces a fixed timing offset in the transmitted sequence. Unlike classical techniques, in which the timing offset is assumed to vary slowly with time and is therefore treated as a constant parameter over the entire burst, this contribution addresses the more challenging case in which the symbol-to-symbol variation of the timing offset is not negligible. A unified gradient-based optimization algorithm is presented that estimates both the sampling frequency offset and the delay offset, iteratively converging to the Maximum Likelihood (ML) estimate. Bit Error Rate (BER) simulation results are presented and compared with the perfectly synchronized case; the results confirm the efficiency of the proposed scheme.
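To make the estimation idea concrete, the sketch below is a minimal Python illustration, not the paper's algorithm: it models each pilot symbol's timing offset as tau_k = tau0 + k*delta, forms a data-aided correlation objective (equivalent to ML in AWGN for known pilots), and jointly refines tau0 and delta by gradient ascent. The sinc pulse shaping, finite-difference gradients, step sizes, and all variable names are illustrative assumptions; the paper presumably derives the gradients analytically.

```python
import numpy as np

rng = np.random.default_rng(0)

# ---- Toy burst under the assumed model (all parameters illustrative) ----
N = 64             # pilot symbols per burst
os = 8             # oversampling factor of the simulated waveform
tau0_true = 0.30   # fixed delay offset (in symbol periods)
delta_true = 2e-3  # per-symbol drift from sampling frequency offset
snr_db = 15.0

a = rng.choice([-1.0, 1.0], size=N)  # known BPSK pilots (data-aided case)

# Band-limited received waveform: a sum of sinc pulses, each symbol k
# shifted by its own timing offset tau_k = tau0 + k*delta, plus AWGN.
t = np.arange(-4, N + 4, 1.0 / os)   # fine time grid (symbol periods)
k = np.arange(N)
tau_k = tau0_true + k * delta_true
r = (a[None, :] * np.sinc(t[:, None] - k[None, :] - tau_k[None, :])).sum(axis=1)
r += 10 ** (-snr_db / 20) * rng.standard_normal(r.shape)

def objective(tau0, delta):
    """Data-aided ML surrogate: correlate the waveform, interpolated at
    the trial instants k + tau0 + k*delta, with the known pilots.
    In AWGN this correlation is maximized at the true timing parameters."""
    instants = k + tau0 + k * delta
    samples = np.interp(instants, t, r)  # linear interpolation of r(t)
    return float(np.dot(a, samples))

# ---- Joint gradient ascent over (tau0, delta) ----
# Gradients are approximated by central finite differences to keep the
# sketch short; step sizes are hand-tuned for this toy setup.
tau0, delta = 0.0, 0.0
mu_tau, mu_delta, h = 1e-3, 1e-6, 1e-4
for _ in range(500):
    g_tau = (objective(tau0 + h, delta) - objective(tau0 - h, delta)) / (2 * h)
    g_delta = (objective(tau0, delta + h) - objective(tau0, delta - h)) / (2 * h)
    tau0 += mu_tau * g_tau
    delta += mu_delta * g_delta

print(f"estimated tau0  = {tau0:.4f}   (true {tau0_true})")
print(f"estimated delta = {delta:.2e} (true {delta_true:.2e})")
```

Note the much smaller step size on delta: its gradient is scaled by the symbol index k, so an uncompensated step size would make the drift update unstable long before the delay estimate settles.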
