Abstract

Multiple instance learning (MIL) has been studied actively in recent years, but it faces a computational challenge as data volumes grow, and parallel computing is an effective way to overcome it. In this paper, we propose a new MIL method based on a MIL back-propagation neural network (MIBP), an extension of the standard back-propagation neural network (BPNN) that uses labeled bags of instances as training data, and we apply parallel computing to speed up the learning process. The proposed method finds a concept point t in the feature space that is close to instances from positive bags and far from instances in negative bags. The method proceeds as follows. First, train the MIBP with positive and negative bags. Second, extract t from the trained MIBP: for each positive bag, present all of its instances to the trained network and select the one with the maximal output value; t is then obtained by averaging the selected instances. Finally, perform a sensitivity analysis of the trained MIBP to obtain feature relevance/weighting information. Parallel computing is carried out during the training of the MIBP. We conduct experiments to measure the classification performance of the obtained t and to evaluate the parallel computing scheme. Experimental results on the MUSK data set show that our method achieves better classification performance and is more computationally efficient than other well-established MIL methods.
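As a rough illustration of the extraction step described above, the Python sketch below shows how a concept point t could be obtained from a trained network: for each positive bag, all instances are passed through the network, the instance with the maximal output is kept, and the kept instances are averaged. The class `TrainedMIBP` and the function `extract_concept_point` are hypothetical names, and the randomly initialized single-hidden-layer network is only a stand-in for the actual MIBP trained on labeled bags; this is a minimal sketch under those assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class TrainedMIBP:
    """Hypothetical stand-in for a trained MIL back-propagation network.
    In the actual method this would be the MIBP after bag-level training."""
    def __init__(self, n_features, n_hidden=8):
        self.W1 = rng.standard_normal((n_features, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.standard_normal(n_hidden)
        self.b2 = 0.0

    def forward(self, x):
        # Sigmoid output in [0, 1]; a higher value means "more positive".
        h = np.tanh(x @ self.W1 + self.b1)
        return 1.0 / (1.0 + np.exp(-(h @ self.W2 + self.b2)))

def extract_concept_point(net, positive_bags):
    """For each positive bag, keep the instance with the maximal network
    output, then average the selected instances to obtain t."""
    selected = []
    for bag in positive_bags:  # bag shape: (n_instances, n_features)
        outputs = np.array([net.forward(x) for x in bag])
        selected.append(bag[np.argmax(outputs)])
    return np.mean(selected, axis=0)

# Toy usage with two synthetic positive bags of 166-dimensional instances
# (166 is the MUSK feature dimensionality).
n_features = 166
net = TrainedMIBP(n_features)
positive_bags = [rng.standard_normal((5, n_features)),
                 rng.standard_normal((3, n_features))]
t = extract_concept_point(net, positive_bags)
print(t.shape)  # (166,)
```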
