Abstract

To enable effective learning of new tasks with only a few examples, meta-learning acquires common knowledge from existing tasks with a globally shared meta-learner. To further address the problem of task heterogeneity, recent developments balance customization and generalization by incorporating task clustering to generate task-aware modulation applied to the global meta-learner. However, these methods learn task representations mostly from the features of input data, while the task-specific optimization process with respect to the base-learner is often neglected. In this work, we propose a Clustered Task-Aware Meta-Learning (CTML) framework that learns task representations from both features and learning paths. We first conduct rehearsed task learning from the common initialization and collect a set of geometric quantities that adequately describes the learning path. Feeding these quantities into a meta path learner, we automatically abstract a path representation optimized for downstream clustering and modulation. Aggregating the path and feature representations yields an improved task representation. To further improve inference efficiency, we devise a shortcut tunnel that bypasses the rehearsed learning process at meta-test time. Extensive experiments on two real-world application domains, few-shot image classification and cold-start recommendation, demonstrate the superiority of CTML over state-of-the-art methods. We provide our code at https://github.com/didiya0825.
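As a hedged illustration of the learning-path idea described above, the sketch below adapts a toy linear model from a shared initialization on a task's support set, records simple geometric quantities at each inner-loop step (loss value, gradient norm, step displacement), and pools them with a small recurrent encoder. The particular quantities collected, the `rehearse_path` helper, and the GRU-based `path_encoder` are illustrative assumptions, not the paper's exact components.

```python
import torch

# Hedged sketch of the "rehearsed learning path" idea: starting from a
# shared initialization, take a few gradient steps on a task's support set
# and record geometric quantities describing each step. These quantities
# and the encoder below are assumptions for illustration only.

def rehearse_path(w0, support_x, support_y, steps=5, lr=0.1):
    """Adapt a linear regressor from init w0 and describe the path taken."""
    w = w0.clone().requires_grad_(True)
    descriptors = []
    for _ in range(steps):
        loss = ((support_x @ w - support_y) ** 2).mean()
        (grad,) = torch.autograd.grad(loss, w)
        # Per-step descriptor: where we are on the loss surface, how steep
        # it is, and in which direction the update moves.
        descriptors.append(torch.cat([
            loss.detach().view(1),           # altitude on the loss surface
            grad.norm().detach().view(1),    # local steepness
            (-lr * grad).detach(),           # step displacement
        ]))
        w = (w - lr * grad).detach().requires_grad_(True)
    return torch.stack(descriptors)          # shape: (steps, descriptor_dim)

# A meta path learner could pool the per-step descriptors into a single
# path representation; a small GRU is one plausible (assumed) choice.
path_encoder = torch.nn.GRU(input_size=7, hidden_size=16, batch_first=True)

w0 = torch.zeros(5)                          # shared initialization
x, y = torch.randn(10, 5), torch.randn(10)   # a task's support set
path = rehearse_path(w0, x, y)               # (5, 7): 1 + 1 + 5 per step
_, h = path_encoder(path.unsqueeze(0))       # path representation (1, 1, 16)
```

In CTML, such a path representation would then be aggregated with a feature-based representation before clustering and modulation.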
