Abstract
The Cyranose 320 electronic nose (Enose) and zNose™ are two instruments used to detect volatile profiles. In this research, feature-level and decision-level multisensor data fusion models, combined with the covariance matrix adaptation evolution strategy (CMAES), were developed to fuse Enose and zNose data and improve the detection and classification of damaged apples compared with using either instrument alone. Principal component analysis (PCA) was used for feature extraction, and probabilistic neural networks (PNNs) were developed as the classifiers. Three feature-level fusion schemes were compared. Dynamic selective fusion achieved an average classification error rate of 1.8% and a best of 0% over 30 independent runs. Static selective fusion resulted in a 6.1% classification error rate, which was worse than using the individual sensors with selected features only (4.2% for the Enose and 2.6% for the zNose). Simply combining the Enose and zNose features without selection (non-selective fusion) degraded performance further, with a 32.5% classification error rate. This indicates that feature selection using CMAES is an indispensable step in multisensor data fusion, especially when the sensor sources contain a large amount of irrelevant or redundant information. At the decision level, Bayesian network fusion outperformed either individual sensor, with an 11% error rate versus 13% for the Enose and 20% for the zNose. Both feature-level fusion with CMAES optimization and decision-level fusion using a Bayesian network classifier improved system classification performance. The methodology can also be applied to other sensor fusion applications.
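To make the feature-level pipeline outlined above concrete (PCA features per instrument, a CMAES-searched feature subset, and a PNN classifier), the following is a minimal sketch. The synthetic data shapes, the pycma dependency, the 0.5 mask threshold, and the Gaussian-kernel PNN implementation are all illustrative assumptions, not the authors' published code.

```python
import numpy as np
import cma                               # pycma package (assumed dependency)
from sklearn.decomposition import PCA

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """PNN as a Parzen-window classifier: each class score is the mean
    Gaussian-kernel response of the test point to that class's training set."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        scores.append(np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1))
    return classes[np.argmax(np.column_stack(scores), axis=1)]

def fused_error(mask, feats, y, train_idx, val_idx):
    """Validation error of the PNN on the feature subset encoded by `mask`."""
    cols = mask > 0.5                    # threshold continuous CMAES genes to a binary mask
    if not cols.any():
        return 1.0                       # empty subset: worst possible fitness
    X = feats[:, cols]
    pred = pnn_predict(X[train_idx], y[train_idx], X[val_idx])
    return float(np.mean(pred != y[val_idx]))

# Placeholder response matrices standing in for Enose and zNose measurements.
rng = np.random.default_rng(0)
n = 120
enose = rng.normal(size=(n, 32))
znose = rng.normal(size=(n, 60))
y = rng.integers(0, 2, size=n)           # e.g. damaged vs. undamaged apples

# Feature extraction per instrument, then concatenation (feature-level fusion).
feats = np.hstack([PCA(n_components=5).fit_transform(enose),
                   PCA(n_components=5).fit_transform(znose)])

# CMAES searches a continuous vector whose thresholded entries select features.
train_idx, val_idx = np.arange(0, 80), np.arange(80, n)
res = cma.fmin(fused_error, 0.5 * np.ones(feats.shape[1]), 0.3,
               args=(feats, y, train_idx, val_idx),
               options={'maxfevals': 300, 'verbose': -9})
print("selected feature indices:", np.where(res[0] > 0.5)[0])
```

In this sketch the selective-fusion idea is simply the binary mask over the concatenated PCA features; dropping the mask (using all features) corresponds to the non-selective fusion case the abstract reports as performing worst.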