Few-shot learning has become essential for building neural network models that generalize to novel classes given only a few labeled samples. Previous studies mostly focused on building a class prototype from the relations among intra-class sample features and used this prototype to classify target samples. Because the number of labeled samples is limited in few-shot settings, such methods may fail to produce prototypes that are representative enough for classification. To address this issue, we propose an attentive pooling network (APNet), which establishes the relationship between the prototype and the target sample feature to highlight their important regions. APNet selectively assigns higher weights to local features with higher relative importance scores in both the prototype and the target feature map. By minimizing the classification loss through supervised learning, APNet learns to produce prototypes that are tailored to the target feature according to these relative importance scores. To verify the effectiveness of APNet, we compared it with existing methods on two popular few-shot learning datasets; APNet outperforms the related methods, achieving 71.12% and 77.58% classification accuracy in the 5-shot setting on the miniImageNet and CUB datasets, respectively.
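To make the attentive pooling idea concrete, the sketch below shows one generic way such a mechanism can be realized: cross-correlation scores between prototype and query local features are turned into relative importance weights, which then pool each feature map into a single vector. This is a minimal illustration under assumed names (`attentive_pool`, flattened `n_locations x dim` feature maps), not the paper's exact APNet implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attentive_pool(proto, query):
    """Attentive pooling between a prototype feature map and a query
    (target) feature map, each flattened to (n_locations, dim).

    Returns pooled prototype and query vectors, weighted by relative
    importance scores derived from their cross-correlation."""
    # Cross-correlation between every pair of local features.
    scores = proto @ query.T                  # shape (n_p, n_q)
    # Relative importance of each local feature: its strongest response
    # to the other feature map, normalized into attention weights.
    w_proto = softmax(scores.max(axis=1))     # shape (n_p,)
    w_query = softmax(scores.max(axis=0))     # shape (n_q,)
    # Weighted pooling of local features into single descriptors.
    return w_proto @ proto, w_query @ query

# Toy example: 4 prototype locations, 5 query locations, 8-dim features.
rng = np.random.default_rng(0)
p_vec, q_vec = attentive_pool(rng.normal(size=(4, 8)),
                              rng.normal(size=(5, 8)))
print(p_vec.shape, q_vec.shape)  # (8,) (8,)
```

In an actual few-shot pipeline the pooled prototype would then be compared with the pooled query vector (e.g. by cosine or Euclidean distance) to produce classification scores trained with a standard cross-entropy loss.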