Abstract

Dimensionality reduction plays a major role in face recognition. Discriminant analysis (DA) and principal component analysis (PCA) are two of the most important approaches in this field. In particular, subclass discriminant analysis (SDA) is a well-known scheme for feature extraction and dimensionality reduction, widely used in high-dimensional data-driven applications such as face recognition and image retrieval, and applicable in a broad range of scenarios. However, SDA incurs a high cost in time and space because it requires an eigendecomposition of large scatter matrices, and it suffers from the singularity problem: when the dimensionality of the data exceeds the number of observations, the scatter matrices become singular. Recent work has widely reported that 2D methods operating on a matrix-based representation outperform traditional 1D vector-based ones. In this paper, we propose a novel 2D-SDA algorithm that avoids the "curse of dimensionality" and addresses the singularity issue. The performance of the proposed algorithm is evaluated for face recognition in terms of recognition accuracy and computational cost. Experiments are conducted on four benchmark face databases and compared with several competitive 1D and 2D methods based on PCA and DA. Results show that 2D-SDA achieves the best recognition performance at low dimensions; in particular, it works significantly better on large data sets, where intra-class variation matters most.
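
The singularity issue and the motivation for a matrix-based representation can be illustrated with a minimal sketch (NumPy, not taken from the paper): flattening images into long vectors yields a scatter matrix whose size grows with the pixel count, and it becomes rank-deficient whenever there are fewer samples than dimensions, whereas keeping each image as a matrix yields much smaller scatter matrices. The image size, sample count, and variable names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, h, w = 50, 32, 32            # e.g. 50 face images of 32x32 pixels (assumed sizes)

# 1D vector-based representation: each image flattened to a 1024-dimensional vector.
X_1d = rng.standard_normal((n_samples, h * w))
S_1d = np.cov(X_1d, rowvar=False)        # 1024 x 1024 scatter (covariance) matrix
# With only 50 samples in 1024 dimensions, S_1d is rank-deficient (singular),
# so DA methods that must invert or eigendecompose it run into the singularity problem.
print("1D scatter shape:", S_1d.shape, "rank:", np.linalg.matrix_rank(S_1d))

# 2D matrix-based representation: keep each image as a 32x32 matrix and
# accumulate a column-wise scatter matrix of size 32 x 32 instead.
X_2d = X_1d.reshape(n_samples, h, w)
mean_img = X_2d.mean(axis=0)
S_2d = sum((img - mean_img).T @ (img - mean_img) for img in X_2d)
# The 32x32 scatter matrix is typically full rank even with few samples,
# which is the practical advantage 2D methods exploit.
print("2D scatter shape:", S_2d.shape, "rank:", np.linalg.matrix_rank(S_2d))
```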
