A general model for multisource classification of remotely sensed data based on Markov random fields (MRF) is proposed. A specific model for the fusion of optical images, synthetic aperture radar (SAR) images, and GIS (geographic information system) ground-cover data is presented in detail and tested. The MRF model exploits spatial class dependencies (spatial context) between neighboring pixels in an image, as well as temporal class dependencies between different images of the same scene. Because the model includes the temporal aspect of the data, it is suitable for detecting class changes between the acquisition dates of different images. The performance of the proposed model is investigated in two experiments: land-use classification by fusing Landsat TM images, multitemporal ERS-1 SAR images, and GIS ground-cover maps, and agricultural crop classification based on Landsat TM images, multipolarization SAR images, and GIS crop field border maps. The performance of the MRF model is compared to that of a simpler reference fusion model. On average, the MRF model yields slightly higher (2%) classification accuracy when the same data are used as input to both models. When GIS field border data are included in the MRF model, its classification accuracy improves by 8%. For change detection in agricultural areas, 75% of the actual class changes are detected by the MRF model, compared to 62% for the reference model. Based on the well-founded theoretical basis of Markov random field models for classification tasks and the encouraging experimental results of this small-scale study, the authors conclude that the proposed MRF model is useful for classification of multisource satellite imagery.
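For orientation, the kind of posterior energy typically minimized in such spatial-temporal MRF fusion models can be sketched as below; this is an illustrative form based on standard MRF classification practice, not the exact expression from the paper, and the weights $\beta_s$ and $\beta_t$ are assumed parameters controlling the strength of the spatial and temporal context terms.

% Schematic posterior energy for multisource MRF classification (illustrative,
% not the paper's exact model). c_i is the class label of pixel i, x_i^{(k)} its
% measurement from data source k, N_i its spatial neighborhood, c_i^{prev} its
% label at the previous acquisition date, and delta the Kronecker delta.
\[
U(c \mid x) \;=\; \sum_i \sum_k -\ln p\!\left(x_i^{(k)} \mid c_i\right)
\;+\; \beta_s \sum_i \sum_{j \in N_i} \bigl(1 - \delta(c_i, c_j)\bigr)
\;+\; \beta_t \sum_i \bigl(1 - \delta(c_i, c_i^{\mathrm{prev}})\bigr)
\]

The first term measures agreement between each pixel's label and its observations from every source, while the second and third terms penalize label disagreement with spatial neighbors and with the previous acquisition date, which is what makes such a model applicable to change detection.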