Abstract
Background: In the retrieval of spatiotemporal information of chorography (STIC), one of the most important problems is how to quickly pinpoint the desired STIC text in massive chorography databases. Domestically, few means exist for retrieving spatiotemporal information from chorography databases. Emerging techniques like data mining, artificial intelligence (AI), and natural language processing (NLP) should be introduced into the informatization of chorography. Objective: This study aims to devise an information retrieval method for STIC based on deep learning and to demonstrate its feasibility. Methods: First, the authors explained the flow for retrieving and analyzing the data features of STIC texts and established a deep hash model for STIC texts. Next, the data matching flow was defined for STIC texts: the learned hash code was adopted as the memory address of each STIC text, and the Hamming distance between hash codes was computed through linear search, thereby completing the STIC retrieval task. Results: Our STIC text feature extraction model learned better STIC text features than the contrastive method. It learned many hash features and differentiated well between different pieces of information when the number of hash bits was large. Conclusion: Our hash algorithm achieved the best retrieval accuracy among the compared methods, and the hash features it acquires accelerate the retrieval of STIC texts. These experimental results demonstrate the effectiveness of the proposed model and algorithm.
Journal: Recent Advances in Computer Science and Communications