Abstract

Infrared spectral super-resolution has achieved great success on spectral signals in the noise-free case. In real scenarios, however, random noise and band overlap restrict super-resolution performance; moreover, weak signals degraded by band overlap and random noise make infrared spectral data acquisition slow, which limits its wide application. To address this problem, a novel spectral super-resolution model with fast linear canonical transform (LCT) regularization is proposed, which can effectively improve the resolution of a noisy, low-resolution spectrum. To reveal the difference in sparsity between the observed infrared spectrum and the true one, the LCT is used to compare their coefficient distributions: the LCT coefficients of the true infrared spectrum are sparser than those of the observed spectrum. An infrared spectral restoration model is therefore proposed that regularizes the sparsity of the observed spectrum with the L0-norm. Experimental results on simulated and real spectral signals show that the proposed algorithm effectively preserves the details of the infrared spectrum while suppressing random noise.
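To make the idea concrete, below is a minimal sketch of this style of L0-regularized restoration in an LCT domain. It is an illustration under stated assumptions, not the paper's implementation: the discrete LCT is a simplified chirp-FFT-chirp approximation with illustrative parameters (a, b, d), the degradation is reduced to additive noise (no blurring or downsampling operator), and the model is solved by iterative hard thresholding as a stand-in for the paper's L0-regularized optimization; the threshold, step size, and iteration count are placeholders.

import numpy as np

# Simplified discrete LCT (case b != 0) via the chirp -> FFT -> chirp
# decomposition.  The chirp scaling used here is illustrative, not the
# paper's exact discretization.
def lct(x, a, b, d):
    n = x.size
    t = np.arange(n) - n / 2.0
    pre = np.exp(1j * np.pi * (a / b) * t**2 / n)    # input chirp modulation
    post = np.exp(1j * np.pi * (d / b) * t**2 / n)   # output chirp modulation
    return post * np.fft.fft(pre * x, norm="ortho")

# Exact numerical inverse of lct() above: undo the chirps and the FFT.
def ilct(X, a, b, d):
    n = X.size
    t = np.arange(n) - n / 2.0
    pre = np.exp(1j * np.pi * (a / b) * t**2 / n)
    post = np.exp(1j * np.pi * (d / b) * t**2 / n)
    return np.conj(pre) * np.fft.ifft(np.conj(post) * X, norm="ortho")

# Iterative hard thresholding for an L0-sparsity prior in the LCT domain:
# take a gradient step toward the observed spectrum y, then zero out small
# LCT coefficients (hard thresholding promotes L0 sparsity).
def restore(y, a=1.0, b=0.5, d=1.0, thresh=0.1, step=0.5, iters=50):
    x = y.astype(complex)
    for _ in range(iters):
        x = x - step * (x - y)              # gradient step on 0.5 * ||x - y||^2
        X = lct(x, a, b, d)
        X[np.abs(X) < thresh] = 0.0         # hard threshold in the LCT domain
        x = ilct(X, a, b, d)
    return x.real

# Toy usage: two overlapping Gaussian bands plus random noise.
rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 512)
clean = np.exp(-((grid - 0.45) / 0.01)**2) + 0.6 * np.exp(-((grid - 0.55) / 0.02)**2)
noisy = clean + 0.05 * rng.standard_normal(grid.size)
recovered = restore(noisy)

Each iteration trades fidelity to the noisy measurement against sparsity of the LCT coefficients; a full super-resolution model would additionally include the band-overlap (blurring/downsampling) operator in the data term, which is omitted here for brevity.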
