In this paper, a new fusion approach based on the Contourlet hidden Markov Tree (CHMT) and a clarity-saliency driven Pulse Coupled Neural Network (PCNN) is proposed for remote sensing image fusion. Because wavelets fail to represent the geometry of image edges and contours, we first use the Contourlet transform to provide an efficient and flexible multiscale, multidirectional, and anisotropic representation of remote sensing images, and then use the CHMT model to describe the statistics of the Contourlet coefficients, taking the dependencies across scales, locations, and directions into account. Because a CHMT state can span several adjacent directional subbands at the finer scales, which share similar statistical characteristics, the CHMT model describes images more accurately and with lower computational complexity than the wavelet hidden Markov Tree (WHMT). The Contourlet coefficients of the registered multisource images are trained with the Expectation Maximization (EM) algorithm to obtain the model parameters, which are then used to update the Contourlet coefficients. Low-frequency subbands are fused by the magnitude-maximum rule. For the fusion of the high-frequency directional subbands, a PCNN is constructed based on the phenomenon of synchronous pulse bursts in the animal visual cortex, with the linking strength of each neuron determined by a new clarity-saliency measure of the subband images; new firing maps are obtained for each high-frequency subband taking part in the fusion. Experiments are conducted on remote sensing images from the US Airborne Multisensor Pod System (AMPS) program, and the results show the superiority of the proposed method over WHMT- and Contourlet-based approaches in subjective evaluation, implementation speed, and several numerical criteria.
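
As a rough illustration of the fusion rules summarized above, the following Python sketch implements the magnitude-maximum rule for low-frequency subbands and a simplified single-layer PCNN whose per-neuron linking strength is set from a local-variance clarity measure. The specific clarity-saliency measure, the PCNN linking kernel and decay constants, and all parameter values here are assumptions for illustration, not the authors' exact formulation; the CHMT coefficient update via EM is omitted.

```python
import numpy as np
from scipy.ndimage import uniform_filter, convolve

def fuse_lowpass(a, b):
    """Magnitude-maximum rule: keep the coefficient with the larger magnitude."""
    return np.where(np.abs(a) >= np.abs(b), a, b)

def clarity(sub, size=3):
    """Local variance as a stand-in clarity-saliency measure (assumption)."""
    mean = uniform_filter(sub, size)
    return uniform_filter(sub**2, size) - mean**2

def pcnn_fire_map(sub, beta, iters=30):
    """Simplified PCNN: return the cumulative firing count of each neuron."""
    s = np.abs(sub)
    s = s / (s.max() + 1e-12)           # normalized feeding input
    k = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])     # linking kernel (assumed weights)
    y = np.zeros_like(s)                # pulse output of the previous step
    theta = np.ones_like(s)             # dynamic firing threshold
    fires = np.zeros_like(s)
    for _ in range(iters):
        link = convolve(y, k, mode="constant")
        u = s * (1.0 + beta * link)     # internal activity; beta = linking strength
        y = (u > theta).astype(s.dtype) # neurons fire when activity exceeds threshold
        theta = 0.8 * theta + 20.0 * y  # threshold decay plus refractory boost
        fires += y
    return fires

def fuse_highpass(a, b, iters=30):
    """Per pixel, keep the subband coefficient whose neuron fires more often."""
    beta_a = clarity(a) / (clarity(a).max() + 1e-12)
    beta_b = clarity(b) / (clarity(b).max() + 1e-12)
    fa = pcnn_fire_map(a, beta_a, iters)
    fb = pcnn_fire_map(b, beta_b, iters)
    return np.where(fa >= fb, a, b)
```

In the full method these rules would operate on the Contourlet subbands of the two source images after their coefficients have been updated with the EM-trained CHMT parameters; this sketch only shows how a clarity-driven firing map can select between the two high-frequency subbands.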