Abstract
Deep ensemble learning, in which knowledge learned by multiple individual neural networks is combined, has been widely adopted to improve the performance of deep learning systems. This field falls under committee learning, which also includes the construction of neural network cascades. This study focuses on the high-dimensional low-sample-size (HDLS) domain and introduces multiple instance ensemble (MIE) as a novel stacking method for ensembles and cascades. Our proposed approach reformulates the ensemble learning process as a multiple-instance learning problem. We utilise pooling operations, a standard multiple-instance learning solution, to aggregate the feature representations of base neural networks into joint representations as a method of stacking. This study explores various attention mechanisms and proposes two novel committee learning strategies with MIE. In addition, we utilise the capability of MIE to generate pseudo-base neural networks to provide a proof-of-concept for a "growing" neural network cascade that is unbounded by the number of base neural networks. We have shown that our approach provides (1) a class of alternative ensemble methods that performs comparably to various stacking ensemble methods and (2) a novel method for generating high-performing "growing" cascades. The approach has also been verified across multiple HDLS datasets, achieving high performance on binary classification tasks in the low-sample-size regime.
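The core stacking idea described above, attention-based pooling over the feature representations of base networks, can be sketched as follows. This is a minimal illustrative sketch only, not the paper's implementation: the parameter names (`V`, `w`), dimensions, and the use of a tanh-gated attention score are assumptions chosen for clarity, with random weights standing in for trained ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_mil_pool(features, V, w):
    """Attention-based MIL pooling over a 'bag' of base-network features.

    features: (n_base, d) -- one feature vector per base neural network.
    V: (d, h), w: (h,)    -- hypothetical attention parameters.
    Returns the joint representation (d,) and attention weights (n_base,).
    """
    scores = np.tanh(features @ V) @ w   # one scalar score per base network
    alpha = softmax(scores)              # normalised attention over the bag
    joint = alpha @ features             # weighted sum -> joint representation
    return joint, alpha

# Three hypothetical base networks, each producing an 8-dim feature vector.
features = rng.normal(size=(3, 8))
V = rng.normal(size=(8, 4))
w = rng.normal(size=4)

joint, alpha = attention_mil_pool(features, V, w)
```

The joint representation would then feed a meta-classifier, as in conventional stacking; because the pooling is permutation-invariant over the bag, extra (pseudo-)base networks can be added without changing the architecture, which is what permits the "growing" cascade.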