Abstract

Characterization of temporal variations in wireless channel impairments plays an important role in the design of reliable and efficient mobile communication systems. Such channels are termed fading channels because various random phenomena in the propagation path cause the received signal envelope to fade. Simulation models are often used to estimate channel behavior, since analytical models for such channels are highly complex to develop. In this paper, a finite-state Markov model is developed for the evaluation of a multipath fading channel. A cumulative-state and frequency-duration analysis approach is used to compute the channel's outage time and the rate of transition between satisfactory states and outage states. Simulation results obtained for the Rayleigh channel are then used to establish the correspondence between fade depth and outage time for a sample system. These results may be used to estimate the bit error rate and to decide the optimum sampling instant of the received signal. They will also assist in the selection of an appropriate channel code and in interleaver design for future wireless networks with enhanced channel capacity.
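The abstract's two-state view of the channel (satisfactory versus outage) can be illustrated with a short simulation. The following is a minimal sketch, not the authors' implementation: it generates a time-correlated Rayleigh envelope, applies an illustrative fade-depth threshold, and estimates the satisfactory-to-outage transition rate and the average outage duration. All parameter values (sample rate, fade depth, smoothing length) are assumptions chosen only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 1000.0          # envelope sample rate in Hz (assumed)
n_samples = 200_000  # number of envelope samples (assumed)

# Rayleigh envelope from two i.i.d. Gaussian quadrature components.
# A simple moving-average filter stands in for Doppler shaping; a
# Jakes/Clarke spectrum could be substituted for a more faithful model.
i = rng.standard_normal(n_samples)
q = rng.standard_normal(n_samples)
kernel = np.ones(50) / 50.0
i = np.convolve(i, kernel, mode="same")
q = np.convolve(q, kernel, mode="same")
envelope = np.sqrt(i**2 + q**2)

# Outage threshold: 10 dB below the RMS envelope level (illustrative fade depth).
rms = np.sqrt(np.mean(envelope**2))
threshold = rms * 10 ** (-10 / 20)

# Two-state classification: satisfactory (above threshold) vs outage (below).
in_outage = envelope < threshold

# Satisfactory -> outage transitions give the downward level-crossing rate.
down_crossings = np.sum(~in_outage[:-1] & in_outage[1:])
duration = n_samples / fs
level_crossing_rate = down_crossings / duration  # crossings per second

# Average fade (outage) duration = total outage time / number of outage events.
total_outage_time = np.sum(in_outage) / fs
avg_fade_duration = total_outage_time / max(down_crossings, 1)

print(f"Level-crossing rate:   {level_crossing_rate:.2f} /s")
print(f"Average fade duration: {avg_fade_duration * 1e3:.2f} ms")
print(f"Outage probability:    {np.mean(in_outage):.4f}")
```

Sweeping the threshold over a range of fade depths and recording the resulting outage statistics is one way to obtain the fade-depth versus outage-time correspondence the paper reports for its sample system.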
