Abstract

Functional data analysis has become a research hotspot in the field of data mining. Traditional data mining methods treat functional data as discrete, finite observation sequences, ignoring their continuity. In this paper, we address functional data classification by proposing a functional generalized eigenvalue proximal support vector machine (FGEPSVM). Specifically, we seek two nonparallel hyperplanes in function space: a positive functional hyperplane and a negative functional hyperplane. The former is closest to the positive functional data and furthest from the negative functional data, while the latter has the opposite properties. By introducing an orthonormal basis, the problem in function space is transformed into one in vector space. Notably, higher-order derivative information is exploited in two ways: we use the derivatives alone, or a weighted linear combination of the original function and its derivatives. Exploiting this additional information can be expected to improve classification accuracy. Experiments on artificial datasets and benchmark datasets show the effectiveness of FGEPSVM for functional data classification.
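To make the basis-expansion step concrete, the following is a minimal sketch (not the authors' code) of how a discretized curve can be replaced by its coefficient vector in an orthonormal basis, and of one way to form the weighted combination with a derivative. The Legendre basis, the sampling grid t, and the names basis_coefficients, with_derivative, n_basis, gamma are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def basis_coefficients(curves, t, n_basis=8):
    """Project curves of shape (n_samples, n_points), sampled on the
    common grid t, onto an orthonormal Legendre basis on [t[0], t[-1]]."""
    a, b = t[0], t[-1]
    u = 2.0 * (t - a) / (b - a) - 1.0                 # map [a, b] -> [-1, 1]
    basis = np.polynomial.legendre.legvander(u, n_basis - 1)
    # normalize so that the basis is orthonormal on [a, b]
    basis = basis * np.sqrt((2.0 * np.arange(n_basis) + 1.0) / (b - a))
    # coefficient c_k = <x, phi_k>, approximated by the trapezoidal rule
    return np.trapz(curves[:, :, None] * basis[None, :, :], t, axis=1)

def with_derivative(curves, t, gamma=0.5):
    """One way to inject derivative information: a weighted linear
    combination of each curve and its first derivative (weight gamma
    is an assumed illustrative value)."""
    return curves + gamma * np.gradient(curves, t, axis=1)
```

Once each curve is reduced to a coefficient vector, any vector-space classifier of the GEPSVM family can be applied to the coefficients.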

Highlights

  • Functional data analysis has become a research hotspot in the field of data mining

  • We find that functional generalized eigenvalue proximal support vector machine (FGEPSVM) is sensitive to δ

  • This paper proposes a functional generalized eigenvalue proximal support vector machine (FGEPSVM), which seeks two nonparallel hyperplanes in function space, each close to the functional data of one class and far from the functional data of the other class


Summary

Related Work

Throughout this paper, uppercase and lowercase characters denote matrices and vectors, respectively. R^{m×n} and R^n denote the set of m × n matrices and the set of n-dimensional vectors. The inner product of two vectors x and y in the n-dimensional real space R^n is defined by <x, y> = x^T y. The norm in this paper refers to the 2-norm of a vector, denoted by ||x|| = (∑_{i=1}^n x_i^2)^{1/2}, where x = (x_1, …, x_n)^T.

Our work is closely related to the generalized eigenvalue proximal support vector machine (GEPSVM) [32]. The positive plane (2) is closest to the data points of the positive class. Note that the objective functions in (10) and (11) are known as Rayleigh quotients, which can be transformed into a pair of generalized eigenvalue problems.
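For concreteness, the following is a minimal sketch of the regularized GEPSVM subproblem of [32] in the coefficient (vector) space: minimizing the Rayleigh quotient amounts to taking the eigenvector associated with the smallest generalized eigenvalue. It assumes the Tikhonov-regularized formulation with parameter δ (delta below) and a positive definite denominator matrix; gepsvm_plane is an illustrative name, not the paper's API.

```python
import numpy as np
from scipy.linalg import eigh

def gepsvm_plane(A, B, delta=1e-3):
    """Plane w^T x + b = 0 closest to the rows of A and furthest from
    the rows of B; delta is the Tikhonov regularization parameter."""
    Ae = np.hstack([A, np.ones((A.shape[0], 1))])
    Be = np.hstack([B, np.ones((B.shape[0], 1))])
    G = Ae.T @ Ae + delta * np.eye(Ae.shape[1])   # regularized numerator
    H = Be.T @ Be                                 # denominator matrix
    # generalized symmetric eigenproblem G z = lambda H z;
    # eigh assumes H is positive definite (use scipy.linalg.eig otherwise)
    vals, vecs = eigh(G, H)
    z = vecs[:, np.argmin(vals)]                  # smallest eigenvalue
    return z[:-1], z[-1]                          # w, b
```

The negative plane follows by swapping the roles of A and B, and a new point is assigned to the class whose plane is nearer in the |w^T x + b| / ||w|| sense.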

Functional GEPSVM
Numerical Experiments
Artificial Datasets
Benchmark Datasets
Parameter Sensitivity Analysis
Findings
Conclusions