Abstract

Mixture of experts (ME) models comprise a family of modular neural network architectures that aim to decompose complex problems into simpler subtasks. This is done by deploying a gating module that softly divides the input space into overlapping regions, each assigned to one or more expert networks. Support vector machines (SVMs), in turn, are kernel-based, neural-network-like models that approximately implement the structural risk minimization principle. Such learning machines follow the simple but powerful idea of nonlinearly mapping input data into a high-dimensional feature space, in which a linear decision surface discriminating different regions is designed. In this work, we formally characterize and empirically evaluate a novel approach, named the Mixture of Support Vector Machine Experts (MSVME), whose main purpose is to combine the complementary properties of the SVM and ME models. For the formal characterization, we derive a training algorithm for MSVME based on a maximum likelihood criterion and demonstrate that each expert can be trained from an SVM perspective. For the empirical evaluation, we report simulation results on nonlinear dynamic system identification problems, contrasting the performance of the MSVME approach with that of conventional SVM and ME models.
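
The abstract does not spell out the training procedure, so the following is a minimal, hypothetical sketch of how a mixture of SVM experts could be trained with an EM-style maximum-likelihood loop. It is not the authors' algorithm: scikit-learn's SVR stands in for the SVM experts, a logistic regression stands in for the gating network, and the expert count, kernel, noise variance, and toy data are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Toy one-dimensional data with two regimes, standing in for the
# nonlinear dynamic systems used in the paper's experiments.
X = rng.uniform(-3.0, 3.0, size=(400, 1))
y = np.where(X[:, 0] < 0, np.sin(3.0 * X[:, 0]), 0.5 * X[:, 0])
y += 0.05 * rng.standard_normal(len(X))

K = 2                                     # number of experts (assumption)
sigma2 = 0.05                             # assumed Gaussian noise variance
experts = [SVR(kernel="rbf", C=10.0) for _ in range(K)]
gate = LogisticRegression(max_iter=1000)  # stand-in for the gating network

# Random soft responsibilities to start the EM-style loop.
resp = rng.dirichlet(np.ones(K), size=len(X))

for _ in range(10):
    # M-step: each expert is fit with responsibility-weighted samples,
    # approximating the weighted SVM subproblems of the mixture.
    for k in range(K):
        experts[k].fit(X, y, sample_weight=resp[:, k] + 1e-6)
    hard = resp.argmax(axis=1)
    if len(np.unique(hard)) < K:          # gate fit needs every class present
        break
    gate.fit(X, hard)                     # hard-assignment gate fit (a simplification)

    # E-step: the responsibility of expert k for a point is proportional to
    # the gate probability times the Gaussian likelihood of the residual.
    preds = np.column_stack([e.predict(X) for e in experts])
    lik = np.exp(-0.5 * (y[:, None] - preds) ** 2 / sigma2)
    resp = gate.predict_proba(X) * lik + 1e-12
    resp /= resp.sum(axis=1, keepdims=True)

# Prediction: gate-weighted combination of the expert outputs.
preds = np.column_stack([e.predict(X) for e in experts])
y_hat = (gate.predict_proba(X) * preds).sum(axis=1)
print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

The sample-weighted expert fit mimics the responsibility weighting that a maximum-likelihood mixture induces on each expert's loss; fitting the gate on hard assignments is a common simplification of training the gating network on the soft responsibilities themselves.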
