Abstract

In this study, a new algorithm for classifying ground vehicles in standard synthetic aperture radar (SAR) images is proposed. The radial Chebyshev moment (RCM) is a discrete orthogonal moment with distinctive advantages over other moments for feature extraction. Unlike Hu's invariant moments, its orthogonal basis gives it minimal information redundancy, and its discrete nature offers benefits over Zernike moments (ZM): there are no numerical approximation errors and no extra computational cost for coordinate normalization. In this context, we propose to use the RCM as the feature extraction mechanism on segmented images and to compare the results of the fused images described with both Zernike and radial Chebyshev moments. First, by applying different thresholds, the target and shadow parts of each SAR image are extracted separately. The segmented images are then fused by combining the segmented region, segmented boundary and segmented texture. Experimental results verify that the accuracy of the RCM improves significantly over that of the ZM: a ten percent improvement in accuracy is obtained by using the RCM together with the fusion of the segmented target and shadow parts. Furthermore, feature fusion improves the overall classification accuracy by as much as 6%.
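
As a rough illustration of the segmentation step described above, the sketch below thresholds a normalized SAR chip into bright (target) and dark (shadow) masks and fuses simple region and boundary cues. The threshold values, helper names and additive fusion rule are illustrative assumptions, not the parameters used in this study, which additionally fuses a texture component.

```python
# Illustrative sketch only: thresholds, helper names and the fusion rule
# are assumptions, not the exact parameters used in the paper.
import numpy as np
from scipy import ndimage


def segment_target_and_shadow(chip, t_target=0.7, t_shadow=0.2):
    """Split a SAR chip into bright (target) and dark (shadow) masks."""
    img = (chip - chip.min()) / (np.ptp(chip) + 1e-12)   # rescale to [0, 1]
    target = _largest_component(img >= t_target)          # strong returns  -> target
    shadow = _largest_component(img <= t_shadow)          # missing returns -> shadow
    return target, shadow


def _largest_component(mask):
    """Keep only the largest connected component to suppress isolated speckle."""
    labels, n = ndimage.label(mask)
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)


def fuse_segments(target_mask, shadow_mask):
    """Toy fusion: combine the segmented region and its boundary in one image."""
    region = target_mask.astype(float) - shadow_mask.astype(float)
    boundary = ndimage.binary_dilation(target_mask) ^ target_mask
    return region + boundary.astype(float)
```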

Highlights

  • Synthetic aperture radar (SAR) with very high-resolution images plays a crucial role in automatic target recognition (ATR) for its robust ability to work in all weather conditions during day and night in different applications such as homeland security, surveillance and military tasks [1,2,3,4,5]. Moving and stationary target acquisition and recognition (MSTAR), a standard SAR-ATR database [6], is used for the testing and validation of different algorithms

  • A comparison of feature extraction techniques was carried out between the Zernike moment (ZM) and the radial Chebyshev moment (RCM)

  • We developed a feature extraction algorithm using radial Chebyshev moments and compared it with a commonly used method called Zernike moments

Introduction

Synthetic aperture radar (SAR), with its very high-resolution images, plays a crucial role in automatic target recognition (ATR) thanks to its robust ability to operate in all weather conditions, day and night, in applications such as homeland security, surveillance and military tasks [1,2,3,4,5]. The moving and stationary target acquisition and recognition (MSTAR) dataset, a standard SAR-ATR database [6], is used for the testing and validation of different algorithms. Because of the noisy background of SAR images, various preprocessing techniques have been introduced in the literature to extract the useful information [7,8,9]. Different approaches to feature extraction have also been proposed for SAR target recognition; the problem with these techniques is that they are generally very sensitive to speckle noise [15] and are not rotation invariant. Hu's invariant moments are the simplest method for generating shape descriptors [16]; although they are rotation invariant, they suffer from a high degree of information redundancy because their basis functions are not orthogonal [17]. Zernike polynomials, in contrast, are rotation invariant, robust to speckle noise, and have minimal information redundancy since their basis is orthogonal.
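
For reference, both baseline descriptors mentioned above can be computed with common Python libraries. The snippet below is a minimal sketch, assuming OpenCV for Hu's moments and the mahotas package for Zernike moments on a toy binary chip; it is not the feature pipeline used in this work.

```python
# Minimal sketch of the two baseline shape descriptors discussed above.
# The toy chip, radius and moment degree are arbitrary illustrative choices.
import cv2
import mahotas
import numpy as np

# A toy binary "target" chip standing in for a segmented SAR image.
chip = np.zeros((64, 64), dtype=np.uint8)
cv2.circle(chip, (32, 32), 12, 255, -1)

# Hu's seven invariant moments: rotation invariant, but built from
# non-orthogonal geometric moments, hence redundant features.
hu = cv2.HuMoments(cv2.moments(chip)).flatten()

# Zernike moment magnitudes: orthogonal basis over the unit disk,
# rotation invariant and with minimal information redundancy.
zm = mahotas.features.zernike_moments(chip, radius=32, degree=8)

print(len(hu), len(zm))  # 7 Hu values; 25 Zernike magnitudes for degree 8
```

Note that mahotas returns the magnitudes of the complex Zernike moments, which are the rotation-invariant part of the descriptor.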
