Abstract

Images contain large amounts of data, so efficient compression methods for image storage and transmission have attracted wide attention. In this paper we propose a neuro-wavelet model for image compression that combines the advantages of the wavelet transform and neural networks. Images are decomposed by wavelet filters into a set of subbands at different resolutions, corresponding to different frequency bands. Different quantization and coding schemes are applied to each subband according to its statistical properties: the coefficients of the low-frequency band are compressed by differential pulse code modulation (DPCM), while the coefficients of the higher-frequency bands are compressed using a neural network. With this scheme, satisfactory reconstructed images are obtained at large compression ratios.

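The scheme described above is a three-stage pipeline: a 2-D wavelet decomposition, DPCM coding of the low-frequency (LL) subband, and neural-network coding of the high-frequency detail subbands. The Python sketch below only illustrates that pipeline under stated assumptions; it is not the authors' implementation. It uses pywt for the wavelet transform, a simple row-wise DPCM with a uniform quantizer for the LL band, and a PCA-style linear block autoencoder as a stand-in for the trained neural network on the detail bands; the wavelet choice (bior4.4), block size, quantizer step, and code dimension are illustrative assumptions.

# Minimal sketch of the subband coding pipeline described in the abstract.
# Assumptions (not from the paper): an 8-bit grayscale image, the bior4.4
# wavelet, one decomposition level, a uniform quantizer step of 2 for the
# DPCM residuals, and a PCA-style linear block autoencoder standing in for
# the trained neural network on the detail subbands.
import numpy as np
import pywt


def dpcm_code_ll(band, step=2.0):
    """DPCM on the low-frequency band: keep the first column, quantize the
    row-wise differences, and reconstruct by cumulative summation."""
    diff = np.diff(band, axis=1)
    diff_q = np.round(diff / step) * step          # uniform quantization
    return np.cumsum(np.hstack([band[:, :1], diff_q]), axis=1)


def nn_code_detail(band, block=4, k=4):
    """Stand-in for the neural-network coder on a high-frequency band:
    a linear autoencoder over 4x4 blocks, with principal components acting
    as the hidden-layer weights and k coefficients kept per block."""
    h = (band.shape[0] // block) * block
    w = (band.shape[1] // block) * block
    blocks = (band[:h, :w]
              .reshape(h // block, block, w // block, block)
              .swapaxes(1, 2)
              .reshape(-1, block * block))
    mean = blocks.mean(axis=0)
    _, _, vt = np.linalg.svd(blocks - mean, full_matrices=False)
    codes = (blocks - mean) @ vt[:k].T             # "encoder" output
    recon = codes @ vt[:k] + mean                  # "decoder" output
    out = band.copy()
    out[:h, :w] = (recon.reshape(h // block, w // block, block, block)
                        .swapaxes(1, 2)
                        .reshape(h, w))
    return out


def compress_reconstruct(image, wavelet="bior4.4"):
    """One-level DWT, DPCM on LL, autoencoder stand-in on LH/HL/HH,
    then the inverse DWT to obtain the reconstructed image."""
    ll, (lh, hl, hh) = pywt.dwt2(image.astype(float), wavelet)
    ll_rec = dpcm_code_ll(ll)
    details = tuple(nn_code_detail(b) for b in (lh, hl, hh))
    return pywt.idwt2((ll_rec, details), wavelet)


if __name__ == "__main__":
    img = np.random.randint(0, 256, (128, 128)).astype(float)
    rec = compress_reconstruct(img)[:128, :128]
    print("reconstruction MSE:", float(np.mean((rec - img) ** 2)))

In an actual codec the quantized DPCM residuals and the autoencoder outputs would be entropy coded and transmitted; the sketch skips that step and only shows how the subbands are split and reconstructed.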