Abstract

We propose a neural network architecture based on the oblique elliptical basis function for supervised learning problems. In classification, a category may form a disconnected or non-convex region of the feature space, comprising several overlapping or disjoint sub-regions. Many existing supervised learning methods restrict decision regions to be convex. Our proposed method overcomes this restriction by employing a rotational self-constructing clustering algorithm to decompose the feature space into a collection of sub-regions, which are then combined to make up individual categories. An unseen instance is assigned to a category if its similarity to that category exceeds a threshold. The whole framework fits in a five-layer network consisting of input, component-similarity, cluster-similarity, aggregation, and output layers. A similar idea also applies to regression problems. A parameter learning algorithm based on least squares estimation is used to derive the weights of the underlying network. Our approach offers several practical advantages. Through the incorporation of rotation, data can be clustered more appropriately than with standard elliptical basis functions. In addition, our approach is applicable to single-label classification, multi-label classification, and regression problems. A number of experiments are conducted to demonstrate the effectiveness of the proposed approach.
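As a rough illustration of the architecture outlined above, the sketch below shows one way the five-layer forward pass could be organized for classification. The Gaussian form of the component similarities, the product combination, the max aggregator, the threshold value, and all variable names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def cluster_similarity(x, center, widths, rotation):
    """Oblique elliptical basis function (illustrative): rotate the deviation
    into the cluster's principal axes, then combine per-component similarities."""
    d = rotation.T @ (x - center)            # layers 1-2: rotated deviations from the cluster center
    comp = np.exp(-0.5 * (d / widths) ** 2)  # layer 2: component-similarity values
    return np.prod(comp)                     # layer 3: cluster-similarity value

def classify(x, clusters_per_category, threshold=0.5):
    """Layers 4-5 (illustrative): aggregate the similarities of a category's
    clusters and accept the category if the aggregate exceeds a threshold."""
    labels = []
    for label, clusters in clusters_per_category.items():
        s = max(cluster_similarity(x, c, w, R) for (c, w, R) in clusters)  # layer 4: aggregation
        if s >= threshold:                                                 # layer 5: output decision
            labels.append(label)
    return labels
```

Because each category may own several clusters, a disconnected or non-convex category is covered naturally; with a threshold decision per category, the same sketch also accommodates multi-label classification, where an instance can exceed the threshold for more than one category.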
