Abstract

With the popularity of multimodal social media platforms such as TikTok and Instagram, social media data are increasingly multimodal, and multimodal sentiment analysis has become an important research topic. To address the problem that existing multimodal sentiment analysis methods ignore the implicit semantic information contained in images at the multimodal data representation layer, a sentiment analysis model based on multimodal information association using multiple attention mechanisms is proposed. Experiments show that the proposed model outperforms not only existing single-modal sentiment analysis methods but also existing multimodal sentiment analysis methods.
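The abstract does not specify the model's architecture, but the core idea it names, associating text and image representations through attention, can be illustrated with a minimal sketch. The snippet below is a hypothetical, pure-Python example of scaled dot-product cross-modal attention: each text token vector attends over image region vectors and is fused with its attention-weighted image context. All function names and the concatenation-based fusion are illustrative assumptions, not the paper's actual method.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    # Dot product of two equal-length vectors.
    return sum(x * y for x, y in zip(a, b))

def cross_modal_attention(text_feats, image_feats):
    # Illustrative sketch (not the paper's method): for each text token
    # vector, compute scaled dot-product attention over image region
    # vectors, take the weighted sum as the image context, and fuse by
    # concatenating token and context.
    d = len(image_feats[0])
    fused = []
    for q in text_feats:
        scores = [dot(q, k) / math.sqrt(d) for k in image_feats]
        weights = softmax(scores)
        # Attention-weighted sum of image region features.
        ctx = [sum(w * v[i] for w, v in zip(weights, image_feats))
               for i in range(d)]
        fused.append(q + ctx)  # concatenate text token and image context
    return fused

# Toy example: two text tokens, two image regions, 2-d features.
text_feats = [[1.0, 0.0], [0.0, 1.0]]
image_feats = [[1.0, 0.0], [0.0, 1.0]]
fused = cross_modal_attention(text_feats, image_feats)
```

In this toy run, each fused vector has twice the feature dimension, and a token's image context leans toward the region most similar to it; a real model would add learned projection matrices and stack several such attention layers before the sentiment classifier.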
