Abstract

The growth of social networks has produced a large volume of image data, and research on image emotion analysis has gradually attracted wide attention. Current multi-level-feature methods for image emotion analysis simply concatenate the features from each level before classifying emotions, which ignores both the correlation between features at different levels and the synergy between global and local features. This paper therefore proposes MAML, an image emotion model based on mixed attention and multi-level dependence. MAML uses spatial and channel attention mechanisms to extract features from local emotion regions of an image, and a Bi-directional Long Short-Term Memory network (BiLSTM) to model the correlation among the image's multi-level global features. Experimental results on the ArtPhoto and Abstract datasets demonstrate the effectiveness of the MAML model.
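The abstract does not give the exact form of the mixed attention module, so the following is only a minimal sketch of the general idea: channel attention reweights feature channels by their global pooled response, and spatial attention reweights locations, so emotion-salient regions are emphasized before later fusion. All function names and the squeeze-and-excitation-style pooling are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(fmap):
    # fmap: (C, H, W). Squeeze each channel by global average pooling,
    # then gate the channel with a sigmoid weight in (0, 1).
    weights = sigmoid(fmap.mean(axis=(1, 2)))       # shape (C,)
    return fmap * weights[:, None, None]

def spatial_attention(fmap):
    # Pool across channels to obtain a (H, W) saliency map,
    # then gate every spatial location with it.
    weights = sigmoid(fmap.mean(axis=0))            # shape (H, W)
    return fmap * weights[None, :, :]

def mixed_attention(fmap):
    # Apply channel attention first, then spatial attention,
    # analogous to the "mixed attention" described in the abstract.
    return spatial_attention(channel_attention(fmap))

fmap = np.random.rand(8, 4, 4)   # toy feature map: 8 channels, 4x4 spatial
out = mixed_attention(fmap)
print(out.shape)                 # same shape as the input feature map
```

In a full model, the attended feature maps from several network levels would then be fed as a sequence into a BiLSTM so that dependencies between levels are learned rather than lost in a flat concatenation.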

