Abstract

The problem of digital signal recognition is considered under deforming distortions of the signal waveform and additive Gaussian noise. A mathematical model for introducing deformations into signals of known or random waveform is proposed for synthesizing recognition algorithms. The model represents the nonlinear deformation operator as a permutation with repetitions of the elements of the initial discrete signal, together with an additive noise component caused by quantization errors of the continuous deformation function. Two recognition algorithms are synthesized and investigated: an optimal algorithm based on exact calculation of the likelihood functions, and a quasi-optimal algorithm based on a Gaussian approximation of the likelihood functions. Both algorithms are simulated for different variants of the deforming distortions, specified both as deterministic functions and as realizations of random functions. The experimental error probability is compared with its theoretical estimate at different values of signal-to-noise ratio.
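The deformation model described above can be illustrated with a minimal sketch. The code below is an assumption-laden toy version, not the paper's exact operator: the warping function, its strength, the two example templates, and the Monte-Carlo decision rule are all hypothetical choices made for illustration. It shows (a) a deformation realized as a permutation with repetitions of the discrete samples, where rounding the warped time indices plays the role of the quantization error, and (b) a quasi-optimal recognizer that, under Gaussian noise, picks the template whose best sampled deformation gives the smallest squared error (equivalently, the largest Gaussian likelihood).

```python
import numpy as np

def deform(signal, rng, strength=1.0):
    # Deformation operator as a permutation with repetitions: the time
    # axis is warped by a random monotone function, and rounding the
    # warped indices (the quantization step) lets input samples repeat
    # or drop out.  Illustrative model only.
    n = len(signal)
    warp = np.sort(np.clip(np.arange(n) + rng.normal(0, strength, n), 0, n - 1))
    idx = np.round(warp).astype(int)  # quantization of the continuous warp
    return signal[idx]

def recognize(observed, templates, rng, n_deforms=200):
    # Quasi-optimal rule sketch: for each template, sample random
    # deformations and keep the smallest squared error, which under
    # additive Gaussian noise corresponds to the largest likelihood.
    best_k, best_err = 0, np.inf
    for k, tpl in enumerate(templates):
        for _ in range(n_deforms):
            err = float(np.sum((observed - deform(tpl, rng)) ** 2))
            if err < best_err:
                best_k, best_err = k, err
    return best_k

rng = np.random.default_rng(0)
t = np.arange(32)
templates = [np.sin(2 * np.pi * t / 32),      # hypothetical signal class 0
             np.linspace(-1.0, 1.0, 32)]      # hypothetical signal class 1
observed = deform(templates[0], rng) + rng.normal(0, 0.1, 32)  # deformed + AWGN
decision = recognize(observed, templates, rng)
```

In this toy setup the deformed, noisy sine is still far closer to some deformation of the sine template than to any deformation of the monotone ramp, so the rule recovers class 0; the error probability would be estimated by repeating this trial over many noise and deformation realizations at each signal-to-noise ratio.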
