Abstract

This research work combines two methods to extract maximum information about a study area. The quantification of image texture is performed using the grey level co-occurrence matrix (GLCM) technique, and object-based change detection (OBCD) methods based on image classification are used to visually represent the transformation that has developed in the study area. Pre-COVID and post-COVID (during lockdown) panchromatic images of Connaught Place, New Delhi, are investigated to develop a model for the study area. Texture classification of the study area is performed on visual texture features for eight distances and four orientations. Six image classification methodologies are used for mapping the study area: parallelepiped classification (PC), minimum distance classification (MDC), maximum likelihood classification (MLC), spectral angle mapper (SAM), spectral information divergence (SID), and support vector machine (SVM). The GLCM calculations reveal a pattern in the texture features contrast, correlation, angular second moment (ASM), and inverse difference moment (IDM). Maximum classification accuracies of 83.68% and 73.65% are obtained for the pre-COVID and post-COVID image data, respectively, using the MLC technique. Finally, a model is presented for analysing before- and after-COVID images to obtain complete information about the study area both numerically and visually.
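The GLCM configuration described above (eight pixel distances, four orientations, and the features contrast, correlation, ASM, and IDM) can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it assumes the scikit-image functions graycomatrix and graycoprops, uses a randomly generated patch as a stand-in for a window from the panchromatic image, and treats scikit-image's "homogeneity" property as a proxy for IDM.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Placeholder patch; in the study this would be a window cropped from the
# pre- or post-COVID panchromatic image of Connaught Place.
panchromatic_patch = np.random.randint(0, 256, (64, 64), dtype=np.uint8)

# Eight pixel distances and four orientations (0, 45, 90, 135 degrees),
# matching the configuration described in the abstract.
distances = list(range(1, 9))
angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]

glcm = graycomatrix(panchromatic_patch, distances=distances, angles=angles,
                    levels=256, symmetric=True, normed=True)

# Texture features analysed in the study; scikit-image's 'homogeneity'
# is used here as a stand-in for the inverse difference moment (IDM).
for prop in ("contrast", "correlation", "ASM", "homogeneity"):
    values = graycoprops(glcm, prop)  # array of shape (n_distances, n_angles)
    print(prop, values.shape)
```

Each feature is returned as a distance-by-orientation array, which is what allows the study to look for patterns across the eight distances and four orientations.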
