The combination of forward error correction (FEC) and interleaving can improve the reliability of free-space optical (FSO) communication systems. Recent research has optimized the codeword length and interleaving depth under the assumption of a fixed buffering size; however, how the buffering size influences system performance remains an open question. This paper models the system performance as a function of buffering size and FEC recovery threshold, which allows system designers to determine optimum parameters while accounting for the overhead. The modelling is based on statistics of the temporal features of correct data reception and burst-error length, obtained by measuring the channel good time and outage time. The experimental results show good agreement with the theoretical values. This method can also be applied to other channels if a continuous-time Markov chain (CTMC) model of the channel can be derived.
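To illustrate the kind of trade-off the abstract describes, the following is a minimal sketch, not the paper's actual model: it assumes a two-state good/outage channel with exponentially distributed sojourn times (a simple CTMC) and estimates, by simulation, the fraction of interleaved blocks whose erased fraction stays within a given FEC recovery threshold as the buffering span varies. All function names and parameter values (`mean_good_s`, `mean_outage_s`, `recovery_threshold`, and the candidate buffer spans) are hypothetical.

```python
import random

def simulate_ctmc_channel(mean_good_s, mean_outage_s, duration_s, seed=0):
    """Generate alternating (is_good, dwell_time) intervals for a two-state
    good/outage channel with exponentially distributed sojourn times."""
    rng = random.Random(seed)
    t, good, intervals = 0.0, True, []
    while t < duration_s:
        mean = mean_good_s if good else mean_outage_s
        dwell = rng.expovariate(1.0 / mean)
        intervals.append((good, dwell))
        t += dwell
        good = not good
    return intervals

def block_recovery_rate(intervals, block_s, recovery_threshold):
    """Estimate the fraction of interleaved blocks spanning block_s seconds
    (the buffering span) whose erased fraction is at or below the
    assumed FEC recovery threshold."""
    recovered = total = 0
    t_in_block, erased_in_block = 0.0, 0.0
    for good, dwell in intervals:
        while dwell > 0:
            # Fill the current block, splitting intervals at block boundaries.
            step = min(dwell, block_s - t_in_block)
            if not good:
                erased_in_block += step
            t_in_block += step
            dwell -= step
            if t_in_block >= block_s:
                total += 1
                if erased_in_block / block_s <= recovery_threshold:
                    recovered += 1
                t_in_block, erased_in_block = 0.0, 0.0
    return recovered / total if total else float("nan")

if __name__ == "__main__":
    # Hypothetical channel and code parameters, not taken from the paper.
    intervals = simulate_ctmc_channel(mean_good_s=0.8, mean_outage_s=0.05,
                                      duration_s=3600.0)
    for block_s in (0.1, 0.5, 2.0):  # candidate buffering spans in seconds
        rate = block_recovery_rate(intervals, block_s, recovery_threshold=0.2)
        print(f"buffer span {block_s:>4} s -> block recovery rate {rate:.3f}")
```

Under these assumptions, a longer buffering span averages outages over more good time, so the recovery rate generally improves with buffer size at the cost of added latency and memory overhead, which is the trade-off the proposed model is meant to quantify.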