Abstract

For efficient compression of hyperspectral images, we propose a universal Golomb–Rice coding parameter estimation method using a deep belief network, which does not rely on any assumptions about the distribution of the input data. We formulate the selection of the best coding parameter for a given input sequence as a supervised pattern classification problem. Simulations on synthesized data and five hyperspectral image datasets show that we achieve significantly more accurate estimation of the coding parameters, which translates to slightly higher compression than three state-of-the-art methods. More extensive simulations on additional images from the 2006 AVIRIS datasets show that the proposed method achieves overall compression bitrates comparable with other estimation methods, as well as with the sample-adaptive entropy coder employed by the Consultative Committee for Space Data Systems standard for multispectral and hyperspectral data compression. Regarding computational feasibility, we show how to use transferable deep belief networks to speed up training by roughly a factor of five. We also show that inferring the best coding parameters with a trained deep belief network offers computational advantages over brute-force search. As one extension, we propose a novel side-information-free codec, in which a differently trained network learns inter-sequence correlations from the current sequence to predict reasonably good parameters for coding the next sequence. As another extension, we introduce a variable feature combination architecture, in which problem-specific heuristics such as the sample means can be incorporated to further improve estimation accuracy.
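To make the underlying problem concrete, the following sketch (not the paper's network-based method) shows Golomb–Rice code lengths and the brute-force parameter search that serves as the baseline: for each candidate Rice parameter k, a sample of value n costs (n >> k) + 1 + k bits, and exhaustive search picks the k minimizing the total. The sample values and the search range `k_max` are illustrative assumptions.

```python
def rice_code_length(n, k):
    """Bits to encode non-negative integer n with Rice parameter k:
    unary quotient (n >> k), one stop bit, and k remainder bits."""
    return (n >> k) + 1 + k

def best_rice_parameter(samples, k_max=15):
    """Brute-force search: try every k in [0, k_max] and return the
    one giving the minimum total code length for the sequence."""
    costs = [(sum(rice_code_length(n, k) for n in samples), k)
             for k in range(k_max + 1)]
    return min(costs)[1]

# Illustrative residual sequence (e.g. prediction residuals after mapping
# to non-negative integers).
samples = [3, 7, 12, 0, 25, 9, 4, 18]
k = best_rice_parameter(samples)  # k = 3 for this sequence
```

The proposed method replaces this exhaustive search with a classifier that maps features of the input sequence directly to the best k, trading the per-sequence search cost for a single inference pass.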
