Abstract

Few-shot classification algorithms have advanced rapidly in recent years, yet many of them are reaching their bottlenecks. In studying the Prototypical Network, we found that its prototype calculation and loss function are relatively simple and leave room for improvement. In this paper we present a few-shot classification algorithm inspired by the Prototypical Network and the center loss function. Building on the Prototypical Network, we propose an Adaptive Weights Model (AWM) that assigns each sample a more suitable weight, yielding a more reasonable prototype: under this model, bad samples (containing a lot of noise) receive smaller weights, while good samples (containing very little noise) receive larger weights. Building on the center loss function, we propose an Adaptive Sample's Distribution Model (ASDM), which enables us to optimize the distribution of samples. Extensive experiments based on our model show that it is effective. Few-shot learning is becoming increasingly important in machine learning, yet most few-shot learning algorithms rely only on deep neural networks to process samples; here, we offer a different idea that gives more reasonable weights to the samples.
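To make the idea concrete, the following is a minimal sketch (not the paper's exact formulation) of a prototype computed as an adaptive weighted mean of support embeddings, together with a center-loss-style term that pulls samples toward their class prototype. The weighting rule shown here (a softmax over negative distances to the unweighted mean) and the function names are illustrative assumptions, since the abstract does not specify the AWM or ASDM formulas.

import torch

def weighted_prototype(support_emb: torch.Tensor) -> torch.Tensor:
    """support_emb: (k_shot, dim) embeddings of one class's support samples."""
    mean = support_emb.mean(dim=0, keepdim=True)             # plain (unweighted) prototype
    dists = torch.cdist(support_emb, mean).squeeze(1)        # distance of each sample to the mean
    weights = torch.softmax(-dists, dim=0)                   # farther (noisier) samples get smaller weight
    return (weights.unsqueeze(1) * support_emb).sum(dim=0)   # adaptive weighted prototype

def center_pull_loss(embeddings: torch.Tensor, prototype: torch.Tensor) -> torch.Tensor:
    """Center-loss-style term: encourage samples to cluster around their class prototype."""
    return ((embeddings - prototype) ** 2).sum(dim=1).mean()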
