Abstract
High resolution (HR) RGB images, generated by sparse sampling of the visible spectrum, often fail to provide a differentiable modality for computer vision tasks. Hence, such tasks have to rely on gross structures in an image, such as corners and edges, rather than the recorded reflectance of the objects or materials at each pixel in the scene. In contrast to RGB, hyperspectral imaging allows each pixel to record the reflectance of the scene over multiple contiguous spectral bands, resulting in a rich, differentiable modality. However, despite a growing number of applications ranging from agriculture, surveillance, mineralogy, and food processing to eye care, hyperspectral imaging has hitherto been restricted to low spatial resolution due to sensor hardware limitations. In this paper, we propose a hyperspectral super-resolution technique that produces an HR hyperspectral image with a spectral support of 400–1020 nm from a low resolution (LR) hyperspectral image of the same spectral support and an HR multispectral (RGB) image with a reduced spectral support of 400–700 nm. In the first step, we generate an HR hyperspectral prior by estimating HR hyperspectral band images in the 400–700 nm spectral support using detail transfer and alternating iterative minimization. In the next step, we use the generated HR prior to estimate the HR hyperspectral band images for the 710–1020 nm bands by learning a non-negative dictionary of reflectance spectral signatures of all the materials present in the scene from the LR hyperspectral image with spectral support of 400–1020 nm. With the estimated HR hyperspectral prior and the learned dictionary, we predict the non-negative sparse codes for the HR hyperspectral band images in the 710–1020 nm range.
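The following is a minimal illustrative sketch (not the authors' implementation) of the dictionary-learning stage described above: a non-negative dictionary of reflectance signatures is learned from the LR hyperspectral cube, per-pixel non-negative codes are then estimated against the visible-band part of the dictionary using the HR prior, and the same codes are used with the 710–1020 nm rows of the dictionary to predict the unseen HR bands. The function name, array shapes, the choice of NMF for the dictionary, the number of atoms, and the use of non-negative least squares (without an explicit sparsity penalty) are all assumptions made for illustration.

```python
# Illustrative sketch only: non-negative dictionary learning on the LR cube,
# then per-pixel non-negative coding of the HR visible-band prior to predict
# the HR 710-1020 nm bands. Names, shapes, and solver choices are assumptions.
import numpy as np
from sklearn.decomposition import NMF
from scipy.optimize import nnls

def hyperspectral_sr(lr_cube, hr_prior, n_vis_bands, n_atoms=32):
    """lr_cube:  (h_lr, w_lr, B)       LR hyperspectral cube, B bands over 400-1020 nm
       hr_prior: (H, W, n_vis_bands)   estimated HR band images over 400-700 nm
       returns:  (H, W, B-n_vis_bands) predicted HR band images over 710-1020 nm"""
    B = lr_cube.shape[-1]
    lr_pixels = lr_cube.reshape(-1, B)            # one reflectance spectrum per row

    # 1) Learn a non-negative dictionary of reflectance signatures from the LR cube.
    nmf = NMF(n_components=n_atoms, init="random", random_state=0, max_iter=500)
    nmf.fit(lr_pixels)
    dictionary = nmf.components_                  # (n_atoms, B), full 400-1020 nm support
    D_vis = dictionary[:, :n_vis_bands].T         # visible rows, (n_vis_bands, n_atoms)
    D_nir = dictionary[:, n_vis_bands:].T         # NIR rows,     (B - n_vis_bands, n_atoms)

    # 2) For each HR pixel, solve a non-negative least-squares problem against the
    #    visible part of the dictionary to obtain non-negative codes.
    H, W, _ = hr_prior.shape
    hr_vis = hr_prior.reshape(-1, n_vis_bands)
    hr_nir = np.empty((H * W, B - n_vis_bands))
    for i, spectrum in enumerate(hr_vis):
        codes, _ = nnls(D_vis, spectrum)
        # 3) Reconstruct the unseen NIR bands with the same codes and the NIR rows.
        hr_nir[i] = D_nir @ codes
    return hr_nir.reshape(H, W, -1)
```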