Abstract
Standard analytical methods for fish freshness assessment are based on the measurement of chemical and physical attributes related to fish appearance, color, meat elasticity or texture, odor, and taste. These methods have several disadvantages: they are destructive, expensive, time consuming, and require highly skilled operators. In the last decade, rapid advances in the development of novel techniques for evaluating food quality attributes have led to non-invasive and non-destructive instrumental techniques, such as biosensors, e-sensors, and spectroscopic methods. The available scientific reports demonstrate that these new techniques provide a great deal of information with a single test, making them suitable for on-line and/or at-line process control. Moreover, they often require little or no sample preparation and avoid destroying the sample.
Highlights
The review covers the statistical analyses utilized in most of the presented papers. These algorithms, applied mainly to the electronic and spectroscopic techniques, are of great and increasing importance in the sensors field, because they allow experimental data to be processed into qualitative and quantitative models, enabling a fast and non-invasive measurement process. Almost all of these algorithms are based on principal component analysis (PCA), which transforms the original data space (defined by the measured variables) into a new space defined by the directions that contain the majority of the data variance, called principal components (PCs).
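As a minimal illustration of the idea (not code from any of the reviewed papers), the following Python sketch projects a synthetic sensor matrix onto its principal components using scikit-learn; the matrix dimensions and values are assumptions made purely for demonstration.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical sensor matrix: 10 samples x 6 sensor channels (synthetic values).
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 6))

# PCA re-expresses the data in a new basis (the principal components),
# ordered by the amount of variance each direction explains.
pca = PCA(n_components=3)
scores = pca.fit_transform(X)   # coordinates of each sample in PC space
loadings = pca.components_      # weight of each original variable in each PC

print("explained variance ratio:", pca.explained_variance_ratio_)
```

The scores are then typically used for qualitative class separation (e.g., fresh vs. spoiled samples), while regression on the reduced space yields the quantitative calibration models mentioned above.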
The system was tested with crucian carp samples, and the main results showed a linear relation between the measured current and the xanthine concentration, with a correlation coefficient of 0.99 and a limit of detection (LOD) of 20 nM.
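To illustrate how such calibration figures are commonly obtained (this is a generic sketch, not the cited authors' procedure), the snippet below fits a straight line to hypothetical current-vs-concentration data and estimates the LOD with the common 3σ/slope convention; all numbers are made up for demonstration.

```python
import numpy as np

# Hypothetical calibration data: xanthine concentration (nM) vs. sensor current (µA).
conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 800.0])   # nM
current = np.array([0.02, 0.41, 0.83, 1.62, 3.21, 6.45])   # µA

# Least-squares linear fit: current = slope * conc + intercept
slope, intercept = np.polyfit(conc, current, 1)

# Pearson correlation coefficient of the calibration line
r = np.corrcoef(conc, current)[0, 1]

# A common LOD estimate: 3 * (standard deviation of the residuals) / slope
residual_sd = np.std(current - (slope * conc + intercept), ddof=2)
lod = 3.0 * residual_sd / slope

print(f"slope = {slope:.4f} µA/nM, r = {r:.4f}, LOD ≈ {lod:.1f} nM")
```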
One study combined spectral data in the visible/near-infrared range (400–1000 nm) with a stacked denoising autoencoder neural network (SDAE-NN) to predict the cold storage time of salmon, obtaining an R² of 0.98 in prediction and a root mean square error of prediction (RMSEP) of 0.93 days, whereas Agyekum et al. [97] studied the potential of a genetic algorithm to quantify volatile TMA (trimethylamine) concentrations in silver carp, using spectra acquired with an FT-NIR spectrometer and an optical fiber, resulting in an R²p of 0.980 (RMSEP = 5.1 mgN/100 g).
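For readers unfamiliar with these figures of merit, the short sketch below computes R² and RMSEP from a set of predicted vs. reference values; the storage-time data are synthetic and used only to show the formulas, not to reproduce the cited results.

```python
import numpy as np

# Hypothetical predicted vs. reference storage times (days); values are synthetic.
y_ref = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
y_pred = np.array([0.4, 1.7, 4.5, 5.6, 8.9, 9.8, 12.3])

# RMSEP: root mean square error of prediction
rmsep = np.sqrt(np.mean((y_pred - y_ref) ** 2))

# R² of prediction: 1 - SS_res / SS_tot
ss_res = np.sum((y_ref - y_pred) ** 2)
ss_tot = np.sum((y_ref - y_ref.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"RMSEP = {rmsep:.2f} days, R² = {r2:.3f}")
```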