Abstract

Airborne pollen identification is crucial for helping patients prevent pollinosis symptoms. Existing data-driven methods rely on large-scale pollen images with simple backgrounds; in real scenarios, however, backgrounds are complex and datasets are small. These methods therefore face two challenges: (1) interference from irrelevant information; (2) incomplete feature attention. To overcome these challenges, we propose a prior knowledge-guided deep feature learning (PK-DFL) method for real-world optical microscope image classification. Its main steps are as follows. Pollen location detects pollen grains based on color features, boosting the accuracy of shape and texture prior feature extraction. Shape-texture awareness extracts the shape and texture of pollen grains via predefined feature extractors (i.e., a set of shape descriptors and an improved SFTA). These features are used to construct two types of prior knowledge, namely shape-texture attention maps (STA maps) and shape-texture feature vectors (STF vectors). Pollen classification uses a deep convolutional neural network (CNN) to classify pollen by imitating the identification procedure of palynologists. It uses STA maps to weight pollen images and convolutional feature maps, guiding the CNN to focus on critical areas of pollen images (addressing the first challenge). STF vectors are employed to obtain the inter-class similarity of pollen via template matching; this information is further converted into soft targets that supervise the CNN to attend to comprehensive key features (addressing the second challenge). Extensive experiments on real-world datasets demonstrate the effectiveness of PK-DFL, with accuracy and F1-score both exceeding 88%.
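As a purely illustrative sketch (the abstract does not give implementation details; the function names, element-wise weighting scheme, loss combination, and hyperparameters below are assumptions), the STA-map weighting and soft-target supervision described above could be realized along these lines:

```python
import torch
import torch.nn.functional as F

def sta_weighted_input(image: torch.Tensor, sta_map: torch.Tensor) -> torch.Tensor:
    """Weight a pollen image (or a convolutional feature map) with its
    shape-texture attention (STA) map.

    image:   (B, C, H, W) tensor
    sta_map: (B, 1, H, W) tensor with values in [0, 1]
    The element-wise product emphasizes pollen regions and suppresses
    background clutter before the tensor enters the next CNN stage.
    """
    return image * sta_map

def pkdfl_loss(logits: torch.Tensor,
               hard_labels: torch.Tensor,
               soft_targets: torch.Tensor,
               alpha: float = 0.5,
               temperature: float = 2.0) -> torch.Tensor:
    """Combine standard cross-entropy with soft-target supervision.

    soft_targets: (B, num_classes) inter-class similarity scores obtained
    from template matching on STF vectors, normalized to a distribution.
    alpha and temperature are illustrative hyperparameters.
    """
    ce = F.cross_entropy(logits, hard_labels)
    log_probs = F.log_softmax(logits / temperature, dim=1)
    kd = F.kl_div(log_probs, soft_targets, reduction="batchmean")
    return (1 - alpha) * ce + alpha * kd
```

In this reading, the KL-divergence term plays the role of the soft-target supervision, analogous to knowledge distillation, while the weighted cross-entropy term retains the conventional hard-label objective.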

