Abstract

This paper discusses feature extraction methods. Feature extraction methods such as principal component analysis and multiple discriminant analysis are important techniques in machine learning. The purpose of feature extraction is to transform data from a space in which classification is difficult into one in which it is easy. Many conventional machine learning methods, such as artificial neural networks and support vector machines, already include such a transformation; nevertheless, extracting good features before applying these methods leads to better classification results. This paper focuses on principal component regression (PCR). PCR approximates the data with the hyper-planes on which they are distributed. The problem is that when the data do not lie on hyper-planes but instead on hyper-spheres, as with rotating objects, PCR cannot extract features that are useful for classification. This paper proposes a new feature extraction method that computes conformal eigenvectors in conformal geometric algebra (CGA) space to find the approximating hyper-planes or hyper-spheres that fit a set of data in the least-squares sense. In particular, the paper shows that the classification accuracy of the proposed method is better than that of the conventional PCR method.
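
The abstract contrasts PCR's hyper-plane approximation with a least-squares hyper-sphere fit. The sketch below is a rough illustration only, not the authors' algorithm: it stays in ordinary Euclidean coordinates rather than the full CGA representation, fitting a hyper-plane by PCA and a hyper-sphere by linear least squares (the flat-space analogue of solving for a sphere through the conformal embedding). The function names and synthetic data are hypothetical.

```python
import numpy as np

def fit_hyperplane_pca(X):
    """PCA-style hyper-plane fit: the normal is the least-variance direction.

    X is an (m, n) array of m points in R^n.  Returns (normal, offset) such
    that normal @ x == offset for points on the fitted hyper-plane.
    """
    centroid = X.mean(axis=0)
    # Smallest right singular vector of the centered data.
    _, _, Vt = np.linalg.svd(X - centroid)
    normal = Vt[-1]
    return normal, normal @ centroid

def fit_hypersphere_lsq(X):
    """Algebraic least-squares hyper-sphere fit.

    Solves |x|^2 = 2 c.x + d linearly for the center c and d = r^2 - |c|^2,
    which plays the role of the sphere fit obtained via the conformal
    (null-cone) embedding in CGA-based methods.
    """
    m, n = X.shape
    A = np.hstack([2 * X, np.ones((m, 1))])   # unknowns: center c and scalar d
    y = np.sum(X * X, axis=1)                 # |x|^2 for each sample
    sol, *_ = np.linalg.lstsq(A, y, rcond=None)
    center, d = sol[:n], sol[n]
    radius = np.sqrt(d + center @ center)
    return center, radius

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Noisy samples on a circle of radius 2 centered at (1, -1): a single
    # hyper-plane cannot summarize this set, but the sphere fit recovers it.
    angles = rng.uniform(0, 2 * np.pi, 200)
    X = np.stack([1 + 2 * np.cos(angles), -1 + 2 * np.sin(angles)], axis=1)
    X += 0.02 * rng.standard_normal(X.shape)
    print(fit_hypersphere_lsq(X))   # approximately (array([ 1., -1.]), 2.0)
```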
