Abstract

Although few-shot meta-learning has been extensively studied in the machine learning community, fast adaptation to new tasks remains a challenge in the few-shot learning scenario. Neuroscience research reveals that the capability to evolve a neural network's formulation is essential for task adaptation, a property that has been broadly studied in recent meta-learning research. In this paper, we present a novel forward-backward meta-learning framework (FBM) that facilitates model generalization in few-shot learning from a new perspective: neuron calibration. In particular, FBM models the neurons of a deep neural network as calibrated units under a general formulation, where neuron calibration endows neural network-based models with fast adaptation capability by influencing both their forward inference path and their backward propagation path. The proposed calibration scheme is lightweight and applicable to various feed-forward neural network architectures. Extensive experiments on challenging few-shot learning benchmarks validate that our approach, trained with neuron calibration, achieves promising performance, demonstrating that neuron calibration plays a vital role in improving few-shot learning results.
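To make the idea concrete, below is a minimal sketch of what per-neuron calibration could look like in PyTorch. The abstract does not specify the exact formulation, so this sketch assumes calibration takes the form of a learnable scale (gamma) and shift (beta) per output neuron; the names CalibratedLinear, gamma, and beta are illustrative assumptions, not the paper's actual API.

import torch
import torch.nn as nn

class CalibratedLinear(nn.Module):
    """A linear layer whose output neurons are modulated by separate
    calibration parameters, so a meta-learner could adapt gamma/beta
    quickly per task while leaving the base weights largely intact."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # Hypothetical calibration parameters: one scale and one shift
        # per output neuron (the paper's general formulation may differ).
        self.gamma = nn.Parameter(torch.ones(out_features))
        self.beta = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Calibrate each neuron's pre-activation before the nonlinearity;
        # this reshapes the forward pass, and autograd differentiates
        # through gamma/beta, rescaling gradients in the backward pass.
        return torch.relu(self.gamma * self.linear(x) + self.beta)

layer = CalibratedLinear(64, 32)
out = layer(torch.randn(8, 64))  # output shape: (8, 32)

Because gamma and beta scale and shift each neuron's pre-activation, they directly alter the forward inference path; and since gradients flow back through them, the updates reaching the base weights are rescaled as well, which is one plausible reading of how calibration could influence the backward propagation path.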
