Abstract

Diet-related ailments have increased rapidly over the last few decades, driven largely by unhealthy eating habits. Mobile dietary assessment systems that capture real-time images of a meal and analyze its nutritional content can help users improve their dietary habits and, consequently, their health. This paper proposes a novel system that automatically estimates food attributes such as ingredients and nutritional value by classifying an input food image. Our method employs several deep learning models for accurate food identification. In addition to image analysis, attributes and ingredients are estimated by extracting semantically related words from a large text corpus collected from the Internet. We performed experiments on a dataset of 100 classes, with an average of 1000 images per class, and achieved a top-1 classification rate of up to 85%. We also created an extension of the benchmark Food-101 dataset to include sub-continental foods. Results show that the proposed system performs equally well on the original Food-101 dataset and on its sub-continental extension. The proposed system is implemented as a mobile app with applications in the healthcare sector.
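
The two stages summarized above (image-based food classification and corpus-based ingredient estimation) can be illustrated with standard tooling. The Python sketches below are not the authors' released code: the ResNet-50 backbone, the torchvision ImageFolder layout ("data/food101_extended/train"), the gensim Word2Vec model, and the corpus file name ("food_corpus.txt") are assumptions chosen purely for illustration.

# Illustrative sketch: fine-tuning a pretrained CNN for top-1 food classification
# on a Food-101-style dataset (100 classes, ~1000 images each, per the abstract).
# Assumptions: torch/torchvision installed; one folder per class under the path below.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 100

# Standard ImageNet preprocessing for a pretrained backbone
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

train_set = datasets.ImageFolder("data/food101_extended/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True, num_workers=4)

# Replace the ImageNet classifier head with a 100-way food classifier
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # a few epochs of fine-tuning, for illustration only
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

Once a dish label is predicted, semantically related terms (candidate ingredients and attributes) could be retrieved from word embeddings trained on a web-collected corpus, for example:

# Illustrative sketch: querying word embeddings for terms related to a predicted dish.
# Assumptions: gensim installed; "food_corpus.txt" (one sentence per line) stands in
# for the Internet-collected text corpus mentioned in the abstract.
from gensim.models import Word2Vec

sentences = [line.split() for line in open("food_corpus.txt", encoding="utf-8")]
w2v = Word2Vec(sentences, vector_size=100, window=5, min_count=5)

# Top terms most similar to the predicted food label, e.g. "biryani"
print(w2v.wv.most_similar("biryani", topn=10))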
