Abstract

The internet, along with being a source of vast information, is also transforming into a diary for individuals. Apps such as Instagram, Twitter, and Facebook allow people to post their thoughts, pictures, and videos on both private and public subjects, and these platforms let users post through multiple mediums such as video, audio, text, and emoticons. Mining this data in the correct context to recognize a user's current emotion has many applications, since it gives insight into the user's emotions, temperament, personality, and motivation. It can be used for better advertisement targeting, product recommendations, product feedback, tackling cyberbullying, and more. The main difficulty in analysing this data to determine a user's emotions is that it is unstructured. To gauge mood correctly it is crucial to choose a suitable data source: a single format may not provide the full context, whereas a multimodal approach gives a more complete picture. We have attempted to discern a user's mood by designing a system that uses text and image data to corroborate each other. Public tweets scraped from the user's timeline and images uploaded by the user are first classified individually, and the results are then fused at the score level to determine whether the user is happy, sad, or neutral.
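The score-level fusion step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the equal weighting of the two modalities and the exact class labels are assumptions.

```python
# Hypothetical sketch of score-level fusion of a text classifier and an
# image classifier, each outputting per-class probabilities.
MOODS = ["happy", "sad", "neutral"]

def fuse_scores(text_scores, image_scores, text_weight=0.5):
    """Weighted average of per-class scores from the two modalities.

    text_weight is an assumed parameter; the paper's actual weights
    (if any) are not given here.
    """
    image_weight = 1.0 - text_weight
    fused = {
        mood: text_weight * text_scores[mood] + image_weight * image_scores[mood]
        for mood in MOODS
    }
    # The final predicted mood is the class with the highest fused score.
    return max(fused, key=fused.get), fused

# Example: the text classifier leans "happy", the image classifier "neutral".
mood, scores = fuse_scores(
    {"happy": 0.7, "sad": 0.1, "neutral": 0.2},
    {"happy": 0.4, "sad": 0.1, "neutral": 0.5},
)
```

With equal weights, the fused scores here are happy 0.55, sad 0.1, neutral 0.35, so the system would label the user as happy; adjusting `text_weight` shifts how much each modality contributes.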
