Abstract
The main purpose of research on map emotional semantics is to describe and express, through computer technology, the emotional responses people have when observing map images. Map application scenarios are becoming increasingly diverse, and map users' growing demand for emotional information brings new challenges for cartography. However, the lack of emotion evaluation in the traditional map-making process makes it difficult for the resulting maps to achieve emotional resonance with map users. The key to solving this problem is to quantify the emotional semantics of maps, which can help mapmakers better understand map emotions and improve user satisfaction. This paper quantifies map emotional semantics by applying transfer learning and the efficient computational power of convolutional neural networks (CNNs) to establish the correspondence between visual features and emotions. The main contributions of this paper are as follows: (1) a Map Sentiment Dataset containing five discrete emotion categories; (2) three CNNs (VGG16, VGG19, and InceptionV3) applied to the map sentiment classification task and evaluated by classification accuracy; (3) experiments over six parameter combinations to determine the best combination of learning rate and batch size; and (4) an analysis, based on the charts and visualization results, of the visual variables that affect the sentiment of a map. The experimental results show that the proposed method achieves good accuracy (around 88%) and that the emotional semantics of maps follow some general rules.
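
The sketch below illustrates the kind of transfer-learning setup the abstract describes: a pre-trained VGG16 backbone with a new five-class classification head, tuned over learning rate and batch size. It is a minimal illustrative example assuming a Keras-style pipeline; the image size, frozen layers, hyperparameter values, and dataset loading are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch: transfer learning with a pre-trained VGG16 backbone
# for 5-class map sentiment classification (illustrative only).
import tensorflow as tf
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models, optimizers

NUM_CLASSES = 5          # five discrete emotion categories
IMG_SIZE = (224, 224)    # assumed input size for VGG16
LEARNING_RATE = 1e-4     # example value; the paper tunes this
BATCH_SIZE = 32          # example value; the paper tunes this

# Load ImageNet-pretrained convolutional base and freeze it.
base = VGG16(weights="imagenet", include_top=False,
             input_shape=IMG_SIZE + (3,))
base.trainable = False

# Add a new classification head for map sentiment.
inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = base(inputs, training=False)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dense(256, activation="relu")(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = models.Model(inputs, outputs)

model.compile(optimizer=optimizers.Adam(learning_rate=LEARNING_RATE),
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical dataset directory of map images grouped by emotion label.
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "map_sentiment_dataset/train", image_size=IMG_SIZE,
#     batch_size=BATCH_SIZE, label_mode="categorical")
# model.fit(train_ds, epochs=10)
```

In this kind of setup, the same head can be swapped onto VGG19 or InceptionV3 backbones, and a grid over learning rate and batch size (six combinations in the paper) selects the best-performing configuration by validation accuracy.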