Abstract

The authors study the performance of detection algorithms based on linear time-frequency (or time-scale) transforms, for transient signals characterized by linear and nonlinear parametric signal models, and by a 'model mismatch' representing the difference between the model and the actual signal. The transients are assumed to undergo a noninvertible linear transformation prior to the application of the detection algorithm. Examples of such transforms include the short-time Fourier transform and the wavelet transform. Closed-form expressions are derived for the worst-case detection performance for all possible mismatch signals of a given energy. These expressions make it possible to evaluate and compare the performance of various transient detection algorithms, for both single- and multichannel data. Both linear and nonlinear signal models are considered.
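
As a concrete illustration of the setting, the following is a minimal sketch, not the authors' algorithm: a transform-domain correlation detector in which the data pass through a noninvertible linear transform (here a short-time Fourier transform via scipy.signal.stft) before a detection statistic is formed and thresholded. The function detect_transient, the template signal, and the threshold value are illustrative assumptions; the paper's closed-form worst-case bounds over mismatch signals of a given energy are not reproduced here.

```python
# A minimal sketch, not the paper's algorithm: a transform-domain
# correlation detector. The data undergo a noninvertible linear
# transform (an STFT) before the detection statistic is computed.
# `detect_transient`, `template`, and `threshold` are assumptions
# made for illustration only.
import numpy as np
from scipy.signal import stft

def detect_transient(x, template, fs=1.0, nperseg=64, threshold=0.2):
    """Normalized correlation between the STFTs of data and model.

    Returns (statistic, decision); `x` and `template` are assumed to
    have the same length so their STFTs share one time-frequency grid.
    """
    _, _, Zx = stft(x, fs=fs, nperseg=nperseg)         # transform of observed data
    _, _, Zt = stft(template, fs=fs, nperseg=nperseg)  # transform of model signal
    # np.vdot flattens both arrays and conjugates its first argument,
    # so this is an inner product over the whole time-frequency grid.
    stat = np.abs(np.vdot(Zt, Zx)) / (np.linalg.norm(Zt) * np.linalg.norm(Zx))
    return stat, stat > threshold

# Example: a Gaussian-windowed 100 Hz tone buried in white noise.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
model = np.exp(-((t - 0.5) ** 2) / 2e-3) * np.sin(2 * np.pi * 100 * t)
rng = np.random.default_rng(0)
x = model + 0.3 * rng.standard_normal(t.size)
stat, detected = detect_transient(x, model, fs=fs)
print(f"statistic = {stat:.2f}, detected = {detected}")
```

The statistic is a normalized transform-domain correlation, equal to 1 when the observation coincides with the model and near 0 under noise alone. In the paper's framework, the question of interest is how far such a statistic can degrade under the worst-case mismatch signal of a given energy, which is what the closed-form expressions in the abstract quantify.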
