Abstract

Gustatory display research is still in its infancy, even though taste is one of the essential senses that humans exercise every day while eating and drinking. Indeed, foraging and feeding are among the most important and frequent tasks our brain performs every day. Recent studies by psychologists and cognitive neuroscientists have revealed how flavor experiences rely on the complex multisensory integration of cues from all of the human senses. The perception of flavor is multisensory and involves combinations of gustatory and olfactory stimuli. The cross-modal mapping between these modalities needs further exploration in virtual environments and simulation, especially for liquid food. In this paper, we present a customized wearable Augmented Reality (AR) system with an olfactory display to study the effect of vision and olfaction on the gustatory sense. A user experiment and an extensive analysis were conducted to study the influence of each stimulus on the overall flavor, including factors such as age, previous experience with Virtual Reality (VR)/AR, and beverage consumption. The results showed that smell contributes strongly to flavor, with a smaller contribution from vision. However, the combination of these stimuli can deliver a richer experience and a higher belief rate. Beverage consumption had a significant effect on the flavor belief rate. Experience was correlated with the stimulus and age was correlated with the belief rate, and both indirectly affected the belief rate.
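The abstract only names the analysis factors at a high level. As a purely illustrative aid, the following is a minimal sketch, not the authors' actual pipeline, of how a per-trial flavor "belief" response could be tabulated against stimulus condition, age, and beverage consumption; all column names and the toy data are assumptions.

```python
# Hedged sketch of a belief-rate tabulation; the data and column names below
# are illustrative assumptions, not values from the paper.
import pandas as pd
from scipy.stats import chi2_contingency, pointbiserialr

trials = pd.DataFrame({
    # 1 = participant believed the displayed/odor-augmented flavor, 0 = did not
    "believed":  [1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1],
    "stimulus":  ["smell", "vision", "both"] * 4,      # stimulus condition per trial
    "age":       [21, 34, 27, 45, 30, 23, 38, 29, 52, 26, 31, 40],
    "beverages": [2, 5, 3, 1, 4, 2, 6, 1, 3, 2, 5, 4]  # weekly beverage consumption
})

# Belief rate per stimulus condition (smell vs. vision vs. combined).
print(trials.groupby("stimulus")["believed"].mean())

# Chi-square test of independence between stimulus condition and belief.
chi2, p, dof, _ = chi2_contingency(pd.crosstab(trials["stimulus"], trials["believed"]))
print(f"stimulus vs. belief: chi2={chi2:.2f}, p={p:.3f}, dof={dof}")

# Point-biserial correlation between age and the binary belief response.
r, p_age = pointbiserialr(trials["believed"], trials["age"])
print(f"age vs. belief: r={r:.2f}, p={p_age:.3f}")
```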

Highlights

  • People’s interaction with interfaces has mostly been limited to visual and, to a lesser extent, auditory inputs

  • The overall success rate is low compared to the 72.6% reported in the study of [22]; unlike that study, this one was conducted on liquid rather than solid food

  • According to [40], based on the study by Harrison et al. [15], diffusion of flavor compounds between the lipid and aqueous phases is extremely rapid in liquid foods, which affects the release of flavor (see the sketch below)
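The rapid lipid-aqueous diffusion noted in the last highlight is often treated as instantaneous equilibrium partitioning. Below is a hedged worked example, with illustrative numbers not taken from [15] or [40], of how an oil/water partition coefficient and the oil fraction of a beverage determine the aqueous concentration that drives flavor release.

```python
# Hedged worked example of equilibrium partitioning of an aroma compound
# between the lipid and aqueous phases of a liquid food (illustrative only).
def aqueous_fraction(k_ow: float, oil_fraction: float) -> float:
    """Equilibrium aqueous concentration relative to the overall concentration.

    Mass balance for a liquid emulsion:
        c_total = oil_fraction * c_oil + (1 - oil_fraction) * c_water
        c_oil   = k_ow * c_water           (oil/water partition coefficient)
    =>  c_water = c_total / (oil_fraction * k_ow + 1 - oil_fraction)
    """
    return 1.0 / (oil_fraction * k_ow + 1.0 - oil_fraction)

# Hypothetical lipophilic aroma compound (k_ow = 100) in a 5% oil beverage:
print(aqueous_fraction(k_ow=100.0, oil_fraction=0.05))  # ~0.17 of c_total stays aqueous
```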



Introduction

People’s interaction with interfaces has mostly been limited to visual and, to a lesser extent, auditory inputs. Images can affect taste perception [2, 20]. Sometimes a haptic interface is used to enable users to physically interact with the virtual environment (VE) by touching 3D objects [11, 12, 13]. Adding the olfactory and gustatory senses is still in its infancy. This is due to the difficulty of dealing with these two senses: they are based on chemical signals, whereas the visual, auditory, and haptic senses rely on physical signals. Combining AR technology with an olfactory display has potential in the food domain, where it can improve people’s eating experience, especially for those who can eat only limited kinds of foods because of health problems [43].

