Abstract

With fast learning speed and high accuracy, the extreme learning machine (ELM) has achieved great success in pattern recognition and machine learning. Unfortunately, it fails when the labeled samples available for training are insufficient, and such labeled samples are often difficult to obtain because of their high cost. In this paper, we address this problem with transfer learning and propose the joint transfer extreme learning machine (JTELM). First, it applies cross-domain mean approximation (CDMA) to minimize the discrepancy between domains, thereby obtaining one ELM model. Second, subspace alignment (SA) and weight approximation are introduced together into the output layer to enhance the capability of knowledge transfer and to learn another ELM model. Third, the predictions for test samples are jointly determined by the two learned ELM models. Finally, a series of experiments is carried out to investigate the performance of JTELM; the results show that it accomplishes the transfer learning task efficiently and performs better than the traditional ELM and other transfer and non-transfer learning methods.
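As an illustration of the building blocks named above, the sketch below shows a basic ELM (random hidden layer plus ridge-regularized output weights) and a subspace-alignment step in the style of Fernando et al. It is not the paper's JTELM: the function names, the regularization parameter `reg`, and the choice of `tanh` activation are assumptions made for this example, and the CDMA and weight-approximation terms are omitted.

```python
# Hedged sketch only: a minimal ELM and a subspace-alignment step,
# not the JTELM formulation from the paper.
import numpy as np

def elm_train(X, Y, n_hidden=100, reg=1e-2, seed=None):
    """Basic ELM: random hidden layer + ridge-regularized output weights."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights
    b = rng.standard_normal(n_hidden)                  # random biases
    H = np.tanh(X @ W + b)                             # hidden-layer activations
    # Closed-form output weights: beta = (H^T H + reg*I)^{-1} H^T Y
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Predict with a trained ELM."""
    return np.tanh(X @ W + b) @ beta

def subspace_align(Xs, Xt, d=20):
    """Project source data onto its PCA subspace aligned to the target subspace."""
    Ps = np.linalg.svd(Xs - Xs.mean(0), full_matrices=False)[2][:d].T  # source basis
    Pt = np.linalg.svd(Xt - Xt.mean(0), full_matrices=False)[2][:d].T  # target basis
    M = Ps.T @ Pt                                       # alignment matrix
    return Xs @ Ps @ M, Xt @ Pt                         # aligned source, projected target
```

In the same spirit, the two ELM models learned in the first two steps could be combined at test time, for example by averaging their output-layer responses, which mirrors the joint prediction described in the abstract; the paper's exact combination rule is not reproduced here.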
