Abstract

Pollen can cause allergic rhinitis, with an individual’s vulnerability depending on the pollen species and concentration. Therefore, the ability to precisely quantify both the number and species of pollen grains in a certain volume would be invaluable. Lensless sensing offers the ability to classify pollen grains from their scattering patterns, using very few optical components. However, since there could be 1000s of pollen species one may wish to identify, we propose using two separate neural networks to avoid having to collect scattering patterns from every species (and mixture of species). The first neural network generates a microscope-equivalent image from the scattering pattern, having been trained on a limited set of experimentally collected pollen scattering patterns. The second neural network segments the generated image into its components, having been trained on microscope images, allowing pollen species identification (potentially enabling existing databases of microscope images to expand the range of species identified by the segmentation network). In addition to classification, segmentation also provides richer information, such as the number of pixels occupied by, and therefore the approximate size of, particular pollen grains. Specifically, via semantic image segmentation, we demonstrate the identification and projected-area measurement of pollen grain species in generated microscope images containing mixtures and species previously unseen by the image generation network. The mixture images used for training the segmentation neural network were created by fusing together microscope images of isolated pollen grains, while the trained network was tested on microscope images of actual mixtures. The ability to carry out pollen species identification from reconstructed images, without needing to train the identification network on scattering patterns, is useful for the real-world implementation of such technology.
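
For concreteness, a minimal sketch of how this two-stage pipeline might be wired together at inference time is given below, in a PyTorch setting. The model classes (ScatterToImageNet-style generator, segmenter), the tensor shapes, and the species label set are illustrative assumptions, not the architectures used in this work.

```python
# Minimal sketch of the two-stage pipeline described above (PyTorch).
# `generator` and `segmenter` are hypothetical stand-ins for the
# image-generation and semantic-segmentation networks; shapes, class
# names, and the species list are assumptions for illustration.
import torch

SPECIES = ["background", "grass", "birch", "nettle"]  # illustrative label set

@torch.no_grad()
def analyse_scattering_pattern(pattern, generator, segmenter, pixel_area_um2):
    """pattern: (1, 1, H, W) scattering intensity image, normalised to [0, 1]."""
    microscope_like = generator(pattern)   # (1, 1, H, W) generated microscope image
    logits = segmenter(microscope_like)    # (1, n_classes, H, W) class scores
    labels = logits.argmax(dim=1)          # (1, H, W) per-pixel species label
    report = {}
    for idx, name in enumerate(SPECIES[1:], start=1):
        n_pixels = int((labels == idx).sum())
        # The per-species pixel count gives the projected area of its grains.
        report[name] = {"pixels": n_pixels,
                        "projected_area_um2": n_pixels * pixel_area_um2}
    return microscope_like, labels, report
```

Because the segmenter only ever sees the generated image, it can in principle be retrained on existing microscope-image databases without collecting new scattering patterns, which is the separation of concerns the abstract argues for.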

Highlights

  • It is estimated that, depending on geographical location, 10% to 40% of the population in certain areas of Europe suffer from allergic rhinitis [1], and there is evidence that the susceptibility to different pollen species varies depending on an individual’s age [2]

  • Existing databases contain 1000s of microscope images of pollen grains [33]; in this work, we propose that separating the identification task into two components (image generation and image segmentation) alleviates the challenge of collecting scattering patterns for every species

  • An image labelling neural network was trained using the microscope images from all pollen grain species and, once trained, the network was tested on images generated from experimental scattering patterns (a sketch of how composite training mixtures can be fused from isolated-grain images follows this list)
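
The abstract notes that the mixture images used to train the segmentation network were created by fusing microscope images of isolated grains. One plausible way to assemble such a composite, together with its matching per-pixel label mask, is sketched below with NumPy; the bright-background thresholding rule, random placement strategy, and crop format are assumptions for illustration, not the authors' exact procedure.

```python
# Sketch of fusing isolated-grain microscope crops into a synthetic mixture
# image with a matching segmentation mask (NumPy). Threshold and placement
# are illustrative assumptions; crops are assumed smaller than the canvas.
import numpy as np

def paste_grain(canvas, mask, grain_img, species_id, rng, bg_level=230):
    """Paste one isolated-grain crop at a random position; update the label mask."""
    gh, gw = grain_img.shape
    H, W = canvas.shape
    y = rng.integers(0, H - gh)
    x = rng.integers(0, W - gw)
    grain_pixels = grain_img < bg_level          # assume a bright slide background
    region = canvas[y:y + gh, x:x + gw]
    region[grain_pixels] = grain_img[grain_pixels]
    mask[y:y + gh, x:x + gw][grain_pixels] = species_id

def make_mixture(grain_crops, size=(512, 512), seed=0):
    """grain_crops: list of (2-D uint8 grain image, integer species id) pairs."""
    rng = np.random.default_rng(seed)
    canvas = np.full(size, 255, dtype=np.uint8)  # blank slide background
    mask = np.zeros(size, dtype=np.uint8)        # 0 = background class
    for img, species_id in grain_crops:
        paste_grain(canvas, mask, img, species_id, rng)
    return canvas, mask
```

The resulting (canvas, mask) pairs can be used directly as supervised training examples for the segmentation network, while evaluation is carried out on microscope images of actual mixtures, as described in the abstract.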


Introduction

It is estimated that, depending on geographical location, 10% to 40% of the population in certain areas of Europe suffer from allergic rhinitis (hay fever) [1], and there is evidence that susceptibility to different pollen species varies depending on an individual’s age [2]. A sensor that could identify the levels of pollen species at a specific location in real time, so that an individual can either (a) determine the species causing them the most severe symptoms, or (b) mitigate the effects by avoiding that pollen, could therefore aid in reducing these effects. Current techniques for real-time sensing of pollen particles are very limited: optical particle counters only detect particles of an approximate size, not the type of particle, while pollen collected via Burkard traps [9,10] requires laboratory examination to determine the pollen species [3,11]. Automated methods for pollen identification from traps using optical and laser fluorescence techniques have recently been developed [12,13]; however, these devices can be relatively large, and so a sensor that can image a pollen grain with minimal optics, at lower cost and with a small footprint, would be highly desirable.
