Abstract
Evaluating people’s food intake is crucial for establishing the link between diet and disease. Deprivation of vital nutrients causes organ deterioration and increases the risk of severe diseases that manifest in adulthood. Making healthy food choices is one of the best ways to avoid developing chronic diseases such as diabetes, heart disease, stroke, and even certain types of cancer. Hence, this paper proposes a deep learning-based automated nutrition classification system (DL-ANCS) for predicting food ingredients and nutrition. The DL-ANCS combines Internet of Things (IoT) sensors that quantify food nutrition with a smartphone application that compiles ingredient nutritional information. Pictures of meals can be taken with the camera built into most mobile phones, and image processing can automatically identify the food items for record keeping. The effectiveness of the proposed DL-ANCS relies on its ability to accurately classify the food items in these photos using meal prediction algorithms. This research introduces a novel approach to extracting texture information from food photos and shows how these features improve the accuracy of a mobile-phone-based nutritional assessment system. The proposed method achieves a food texture ratio of 98.7%, an effectiveness ratio of 99.2%, an accuracy ratio of 95.89%, a food ingredient prediction and nutritional compatibility ratio of 96.8%, and a food component classification ratio of 97.29%.
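The abstract highlights texture features extracted from food photos as the driver of classification accuracy. The sketch below is only an illustrative assumption of how such a texture descriptor might be computed and fed to a classifier (here a gray-level co-occurrence matrix and a random forest); it is not the paper's actual DL-ANCS pipeline, and the names `texture_features`, `X_train_paths`, and `y_train` are hypothetical.

```python
# Hypothetical sketch: GLCM texture features for food-photo classification.
# These settings and models are illustrative assumptions, not DL-ANCS itself.
import numpy as np
from skimage import io, color
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

def texture_features(image_path):
    """Return a small GLCM-based texture descriptor for one food photo."""
    rgb = io.imread(image_path)
    gray = (color.rgb2gray(rgb) * 255).astype(np.uint8)
    # Co-occurrence matrix over a few offsets and angles (assumed settings).
    glcm = graycomatrix(gray, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Placeholder training step: X_train_paths (image files) and y_train (food
# class labels) are assumed to come from the dataset loader.
# X = np.vstack([texture_features(p) for p in X_train_paths])
# clf = RandomForestClassifier(n_estimators=200).fit(X, y_train)
```

In practice, such handcrafted texture descriptors could either be concatenated with features learned by a deep network or used as a lightweight on-device fallback; the abstract does not specify which design the authors adopt.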