Abstract
Food image classification, a subdomain of Computer Vision (CV), focuses on the automatic classification of food items represented in images. The technology has attracted considerable attention in recent years thanks to its widespread applications, spanning dietary monitoring and nutrition studies to restaurant recommendation systems. By leveraging developments in Deep Learning (DL), especially Convolutional Neural Networks (CNNs), food image classification has become an effective means of interacting with and understanding the nuances of the culinary world. The current research article develops a Bio-Inspired Spotted Hyena Optimizer with a Deep Convolutional Neural Network-based Automated Food Image Classification (SHODCNN-FIC) approach. The main objective of the SHODCNN-FIC method is to recognize and classify food images into distinct types. The presented technique exploits a DL model with a hyperparameter tuning approach for the classification of food images. To accomplish this objective, the SHODCNN-FIC method employs the DCNN-based Xception model to derive feature vectors and applies the Spotted Hyena Optimizer (SHO) algorithm to select optimal hyperparameters for the Xception model. Finally, an Extreme Learning Machine (ELM) model performs the detection and classification of food images. A detailed set of experiments was conducted to demonstrate the improved food image classification performance of the proposed SHODCNN-FIC technique, and the wide range of simulation outcomes confirmed its superiority over other DL models.
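To make the final stage of the pipeline concrete, the sketch below shows a minimal Extreme Learning Machine (ELM) classifier in pure NumPy: random, fixed input-to-hidden weights and output weights solved in closed form via a pseudoinverse. This is an illustrative implementation of the general ELM idea, not the paper's exact configuration; the hidden-layer size, sigmoid activation, and the assumption that inputs are pre-extracted feature vectors (e.g., from Xception) are all choices made here for clarity.

```python
import numpy as np

class ELM:
    """Minimal ELM sketch: random hidden layer, least-squares output layer.

    Assumes X is a matrix of pre-extracted feature vectors (e.g., Xception
    embeddings) and y contains integer class labels 0..K-1. Hidden size and
    activation are illustrative, not the paper's settings.
    """

    def __init__(self, n_hidden=64, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Sigmoid activation over the randomly projected inputs
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        # Input weights/biases are drawn once and never trained
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = self._hidden(X)
        T = np.eye(n_classes)[y]            # one-hot targets
        self.beta = np.linalg.pinv(H) @ T   # output weights in closed form
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)
```

Because only the output weights are fitted, and in closed form, training reduces to a single pseudoinverse, which is what makes ELMs attractive as fast classifiers on top of deep feature extractors.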