Abstract
In this study, a supervised classifier based on a kernel implementation of the Bayes rule is introduced. The proposed technique first applies an implicit nonlinear transformation of the data into a feature space and then fits normal distributions with a common covariance matrix to the mapped data. The kernel concept gives this process the flexibility required to model complex data structures arising from a wide range of class-conditional distributions. Although the decision boundaries in the new feature space are piecewise linear, they correspond to powerful nonlinear boundaries in the original input space. On the data sets we considered, the method yielded encouraging results.
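The pipeline the abstract describes can be sketched as follows. This is an illustrative approximation, not the authors' exact method: scikit-learn's `KernelPCA` stands in for the implicit kernel-induced feature map, and `LinearDiscriminantAnalysis` plays the role of the Bayes rule with Gaussian class-conditionals sharing a common covariance matrix, so the boundary is linear in the feature space but nonlinear in the input space. The data set, kernel, and all parameter values are placeholders chosen only for demonstration.

```python
# Hedged sketch of the kernel-Bayes idea: explicit kernel feature map
# (via KernelPCA, as a stand-in for the implicit mapping) followed by
# LDA, i.e. the Bayes rule under shared-covariance Gaussians.
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# A toy nonlinearly separable problem (placeholder for the paper's data).
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Linear boundary in the kernel feature space -> nonlinear in input space.
clf = make_pipeline(
    KernelPCA(n_components=20, kernel="rbf", gamma=2.0),  # assumed settings
    LinearDiscriminantAnalysis(),
)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

A plain LDA on the raw two-moons input would be limited by its linear boundary; the kernel map is what lets the shared-covariance Gaussian model capture the curved class structure.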