Abstract

This paper presents an importance sampling simulation model that analyzes a communications system consisting of a noisy channel, a transmitter/receiver, and a convolutional encoder/Viterbi decoder. The model determines the amount of signal degradation caused by any noise environment that can be modeled as a Markov chain; the specific example of a Radio Frequency Interference (RFI) noise environment is discussed in detail. The model uses importance sampling to determine low bit error rates (BERs) for a wide range of noise environments. It is faster than a conventional simulation because the required run time is independent of the BER, and it is more flexible than existing analytic models, which rely on major assumptions such as independent symbol errors (perfect interleaving) or infinite-power bursts. The model increases simulation efficiency by biasing the channel statistics so that more codeword errors occur, then compensates with a weighting function whose value is computed from the Markov chain. The results show that interleaving yields a significant performance improvement when the interfering bursts are long relative to the data symbol length.
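
As a rough illustration of the biased-simulation idea described above, the sketch below estimates the BER of a two-state Markov (Gilbert-Elliott-style) burst channel by drawing state sequences from a biased chain that visits the burst state more often, then reweighting each run by the likelihood ratio of the true and biased transition probabilities. All probabilities, state names, and the simplified per-symbol error model are illustrative assumptions, not values or details taken from the paper.

```python
import random

# Hypothetical two-state Markov channel: "G" (good) and "B" (burst).
# All probabilities below are made-up illustrative values.
P_TRUE = {           # true state-transition probabilities
    "G": {"G": 0.99, "B": 0.01},
    "B": {"G": 0.20, "B": 0.80},
}
P_BIASED = {         # biased transitions: burst state made more likely
    "G": {"G": 0.95, "B": 0.05},
    "B": {"G": 0.10, "B": 0.90},
}
P_ERR = {"G": 1e-4, "B": 0.3}   # per-symbol error probability in each state


def estimate_ber(n_runs: int, n_symbols: int) -> float:
    """Importance-sampling BER estimate: simulate the channel under the
    biased chain, then reweight each run by the likelihood ratio of its
    state sequence so the estimator remains unbiased."""
    total = 0.0
    for _ in range(n_runs):
        state, weight, errors = "G", 1.0, 0
        for _ in range(n_symbols):
            # Draw the next state from the *biased* chain.
            nxt = "B" if random.random() < P_BIASED[state]["B"] else "G"
            # Accumulate the likelihood ratio (true / biased probability).
            weight *= P_TRUE[state][nxt] / P_BIASED[state][nxt]
            state = nxt
            # Symbol errors are drawn from the true error model per state.
            if random.random() < P_ERR[state]:
                errors += 1
        total += weight * errors / n_symbols
    return total / n_runs


print(f"IS BER estimate: {estimate_ber(2000, 1000):.3e}")
```

Sampling under the biased chain makes burst-induced errors common while the accumulated likelihood ratio corrects the estimate, which is why the run time needed to observe errors does not grow as the true BER shrinks; the paper applies the same weighting idea to full codeword errors at the Viterbi decoder output rather than to raw symbol errors as in this simplified sketch.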
