Abstract

Existing few-shot learning (FSL) methods usually treat each sample as a single feature point or use intra-class feature transformations to augment features. However, novel-class features in the few-shot setting are vulnerable to noise, intra-class features have large variance, and the direction of intra-class feature transformations is uncontrollable, all of which degrade FSL models. In addition, existing FSL methods are one-generation based: they neither exploit the prior knowledge captured by a prior-generation model to build a more robust posterior model, nor offer interpretability in FSL. In this paper, we propose a novel two-generation Latent Feature Augmentation and Distribution Regularization framework (LFADR), consisting of a prior relation net (PRN) and a VAE-based posterior relation net (VPORN), which builds a more robust VPORN on top of the PRN by transferring prior knowledge. First, we use a simple PRN operating on single original feature points to generate informative prior knowledge. We then propose a VAE-driven, regularized-distribution-based VPORN that augments latent features by sampling from a regularized class-specific distribution conditioned on the prior knowledge transferred from the PRN; this guarantees and controls the diversity of intra-class features, avoids uncontrollable feature transformations, and reduces both intra-class variance and the impact of noise. As a result, LFADR, which is optimized as a variational inference problem, learns more robust intra-class and more discriminative inter-class features and yields clearer decision boundaries in FSL. Furthermore, we analyse the feasibility and effectiveness of our framework using Hoeffding's inequality and Chernoff's bounding method. Finally, experimental results validate our theoretical analysis and the effectiveness of the proposed FSL framework.
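To make the augmentation step concrete, below is a minimal, hypothetical PyTorch-style sketch (not the authors' released code) of VAE-style latent feature augmentation: a support feature is encoded into a class-specific latent Gaussian, extra features are drawn via the reparameterization trick, and a KL term regularizes that distribution so the diversity of the sampled features stays controlled. All module names, dimensions, and hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch of class-specific latent feature augmentation (not the paper's code).
import torch
import torch.nn as nn

class LatentFeatureAugmenter(nn.Module):
    def __init__(self, feat_dim=640, latent_dim=128):
        super().__init__()
        # Encoder maps a backbone feature to the parameters of a latent Gaussian.
        self.mu_head = nn.Linear(feat_dim, latent_dim)
        self.logvar_head = nn.Linear(feat_dim, latent_dim)
        # Decoder maps latent samples back to feature space.
        self.decoder = nn.Sequential(nn.Linear(latent_dim, feat_dim), nn.ReLU())

    def forward(self, support_feat, n_aug=5):
        mu = self.mu_head(support_feat)          # class-specific mean
        logvar = self.logvar_head(support_feat)  # class-specific log-variance
        std = torch.exp(0.5 * logvar)
        # Reparameterization trick: draw n_aug latent samples per support feature.
        eps = torch.randn(n_aug, *std.shape)
        z = mu.unsqueeze(0) + eps * std.unsqueeze(0)
        aug_feat = self.decoder(z)               # augmented features in feature space
        # KL term regularizes the class-specific distribution toward a unit Gaussian,
        # bounding how far augmented samples can drift (controlled diversity).
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return aug_feat, kl

# Usage: augment a single 640-d support feature into 5 additional training features.
augmenter = LatentFeatureAugmenter()
support_feature = torch.randn(1, 640)
augmented, kl_loss = augmenter(support_feature, n_aug=5)
print(augmented.shape, kl_loss.item())  # torch.Size([5, 1, 640])
```

In an episodic training loop, the augmented features and the KL term would be combined with the relation-net classification loss; the exact weighting and conditioning on the PRN's prior knowledge are described in the paper itself and are not reproduced here.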
