A technique has recently been developed whereby near-maximum-likelihood detection of a sampled digital signal can be achieved without undue complexity and without any adaptive linear prefiltering, when intersymbol interference extends over several samples of the signal. It is, however, important that the channel impulse response does not undergo large changes with time, since, when it does, a considerable increase in the complexity of the system may be required to maintain correct operation. The paper describes a development of the detection process whereby correct operation is achieved with a relatively simple system, even when the channel introduces severe frequency-selective fading of the type sometimes experienced over HF radio links. Results of computer-simulation tests are presented, showing the tolerance of a synchronous serial data-transmission system to additive white Gaussian noise, when a 4-point quadrature-amplitude-modulated signal is transmitted at 2400 bit/s over a model of an HF radio link with two independent Rayleigh-fading sky waves and frequency spreads of 0.5, 1 and 2 Hz, and when the novel detection process is used at the receiver. Correct estimation of the channel is assumed throughout.
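As background to the detection problem the abstract describes, the sketch below illustrates full maximum-likelihood sequence detection (a Viterbi trellis search) of a 4-point QAM signal over a short intersymbol-interference channel that is assumed perfectly known at the receiver, matching the abstract's assumption of correct channel estimation. This is not the paper's near-maximum-likelihood process, which reduces the complexity of exactly this kind of search; the two-tap channel `h`, the SNR, and all other parameters here are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# 4-point QAM (QPSK) alphabet, unit average energy
alphabet = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# Hypothetical static 2-tap ISI channel, assumed known at the receiver
h = np.array([1.0, 0.5])

# Transmit a random symbol sequence through the channel with AWGN
n = 200
tx_idx = rng.integers(0, 4, n)
tx = alphabet[tx_idx]
prev = np.concatenate(([0], tx[:-1]))          # one-symbol-delayed copy
snr_db = 25.0
sigma = 10 ** (-snr_db / 20)
noise = sigma * (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
r = h[0] * tx + h[1] * prev + noise

def viterbi_detect(r, h, alphabet):
    """Maximum-likelihood sequence detection over a known 2-tap channel.
    One trellis state per possible previous symbol (4 states here)."""
    m = len(alphabet)
    n = len(r)
    # First sample: the symbol before time 0 is taken as zero
    metric = np.abs(r[0] - h[0] * alphabet) ** 2
    back = np.zeros((n, m), dtype=int)
    for t in range(1, n):
        new_metric = np.empty(m)
        for k in range(m):  # candidate symbol at time t -> new state k
            costs = metric + np.abs(r[t] - h[0] * alphabet[k] - h[1] * alphabet) ** 2
            best = int(np.argmin(costs))
            back[t, k] = best
            new_metric[k] = costs[best]
        metric = new_metric
    # Trace back the minimum-cost path through the trellis
    path = np.empty(n, dtype=int)
    path[-1] = int(np.argmin(metric))
    for t in range(n - 1, 0, -1):
        path[t - 1] = back[t, path[t]]
    return path

detected = viterbi_detect(r, h, alphabet)
print("symbol accuracy:", (detected == tx_idx).mean())
```

The trellis search cost grows with the alphabet size raised to the channel memory, which is why a severely time-dispersive (or fading) HF channel makes exact ML detection expensive and motivates reduced-complexity near-ML schemes such as the one the paper develops.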