Two-dimensional principal component analysis (2DPCA) is known to be highly sensitive to outliers: it employs the squared F-norm as the distance metric and satisfies only the objective of maximizing the projection variance, while the objective of minimizing the reconstruction errors over all samples is left largely unoptimized. To address this problem, a novel cosine objective function for maximizing the weighted projection is first presented, in which the 2-norm of vectors with an adjustable power parameter serves as the distance metric. This cosine objective not only accomplishes the goal of maximizing the projection distance but also indirectly optimizes the goal of minimizing the sum of reconstruction errors. The cosine 2DPCA (Cos-2DPCA) method is then proposed, together with a greedy iterative algorithm for solving it. The convergence and the correlation of the solutions are proved theoretically and discussed in detail. Finally, a series of experiments is carried out on an artificial dataset and on eight standard datasets. The results demonstrate that Cos-2DPCA yields significant improvements in reconstruction, correlation, complexity, and classification, and that it outperforms most existing robust 2DPCA methods.
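For context, the squared-F-norm baseline that Cos-2DPCA improves upon can be sketched as follows. This is a minimal NumPy illustration of conventional 2DPCA (the function name and the random data are illustrative), not the Cos-2DPCA algorithm itself, whose cosine objective and greedy iterative solver are given in the paper.

```python
import numpy as np

def standard_2dpca(images, k):
    """Conventional 2DPCA baseline (squared F-norm objective).

    images: array of shape (n, h, w); k: number of projection axes.
    Returns the (w, k) projection matrix whose columns are the top-k
    eigenvectors of the image scatter matrix.
    """
    X = np.asarray(images, dtype=float)
    mean = X.mean(axis=0)                            # mean image, shape (h, w)
    centered = X - mean
    # Image scatter matrix G = (1/n) * sum_i (A_i - mean)^T (A_i - mean)
    G = np.einsum('nhi,nhj->ij', centered, centered) / X.shape[0]
    eigvals, eigvecs = np.linalg.eigh(G)             # eigenvalues in ascending order
    return eigvecs[:, ::-1][:, :k]                   # top-k eigenvectors as columns

# Usage: project and reconstruct an image A via
#   A_hat = mean + (A - mean) @ W @ W.T
```

Because the scatter matrix `G` aggregates squared deviations, a single outlying image can dominate it, which is the sensitivity that robust variants such as Cos-2DPCA aim to reduce.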