Abstract
Visuospatial working memory is a fundamental human cognitive capability needed for exploring the visual environment. This cognitive function is responsible for creating visuospatial maps, which are useful for maintaining a coherent and continuous representation of the visual and spatial relationships among objects present in the external world. This paper proposes a bio-inspired computational model of Visuospatial Working Memory (VSWM) to endow autonomous Unmanned Aerial Vehicles (UAVs) with this cognitive function. The VSWM model was implemented on a low-cost commercial drone. A total of 30 test cases were designed and executed, grouped into three scenarios: (i) environments with static and dynamic vehicles, (ii) environments with people, and (iii) environments with people and vehicles. The visuospatial ability of the VSWM model was measured in terms of its ability to classify and locate objects in the environment. The VSWM model was capable of maintaining a coherent and continuous representation of the visual and spatial relationships among objects of interest in the environment, even when a visual stimulus was lost due to total occlusion. The VSWM model proposed in this paper represents a step towards autonomous UAVs capable of forming visuospatial mental imagery in realistic environments.
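The abstract states that the VSWM map keeps objects of interest and their spatial relationships alive even when a detection is lost to total occlusion, but does not detail how. The following is only a minimal Python sketch of that general idea, under stated assumptions: tracked objects are retained for an assumed retention window after their last detection, and pairwise spatial relations are recomputed from the remembered positions. The class names, track-id keys, and retention threshold are illustrative assumptions, not the authors' implementation.

    from dataclasses import dataclass
    import math
    import time

    @dataclass
    class MemoryEntry:
        """One object of interest held in visuospatial working memory."""
        label: str        # e.g. "person" or "vehicle"
        position: tuple   # (x, y) location in the drone's map frame
        last_seen: float  # timestamp of the most recent detection

    class VisuospatialWorkingMemoryMap:
        """Keeps objects of interest and their spatial layout alive across frames,
        removing an entry only after it has gone undetected for retention_s seconds,
        so a totally occluded object persists for a while."""

        def __init__(self, retention_s=5.0):
            self.retention_s = retention_s  # assumed retention window, not taken from the paper
            self.entries = {}               # track_id -> MemoryEntry

        def update(self, detections):
            """detections: {track_id: (label, (x, y))} for the current frame."""
            now = time.time()
            for track_id, (label, position) in detections.items():
                self.entries[track_id] = MemoryEntry(label, position, now)
            # Selective removal: drop only entries whose last detection is too old.
            self.entries = {tid: e for tid, e in self.entries.items()
                            if now - e.last_seen <= self.retention_s}

        def spatial_relations(self):
            """Pairwise Euclidean distances among remembered objects."""
            ids = list(self.entries)
            return {(a, b): math.dist(self.entries[a].position, self.entries[b].position)
                    for i, a in enumerate(ids) for b in ids[i + 1:]}

In this sketch, calling update() once per frame is enough to keep the spatial map coherent across brief occlusions, because an undetected object simply keeps its last remembered position until the retention window expires.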
Highlights
Unmanned Aerial Vehicles (UAVs), commonly known as drones, are increasingly popular in the research community and have been widely studied for their potential use in areas such as entertainment [1], marketing [2], healthcare [3], agriculture [4], and security [5].
Materials are available at CogniDron/Visuospatial Working Memory (VSWM)/videos. To estimate the bio-inspired computational model’s accuracy in generating spatial relationships among relevant stimuli in the environment, 100 frame samples were taken from the video of each test case (a minimal sketch of this sampling appears after the highlights).
The selective removal process proposed in the bio-inspired VSWM computational model was useful for maintaining a coherent and continuous representation of the visual and spatial relationships among objects of interest present in the environment.
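The highlights state only that 100 frame samples were taken from each test-case video to estimate accuracy; the sampling scheme and the per-frame judgement criteria are not given here. The sketch below shows one plausible way to compute such an estimate from per-frame correctness labels; the uniform random sampling, the function name, and the example frame count are assumptions for illustration only.

    import random

    def estimate_accuracy(frame_correct, n_samples=100, seed=0):
        """Estimate a test case's accuracy from a random sample of its video frames.

        frame_correct: one boolean per frame, True if the model's classification and
        location of the objects of interest in that frame were judged correct.
        """
        rng = random.Random(seed)
        sample = rng.sample(range(len(frame_correct)), k=min(n_samples, len(frame_correct)))
        return sum(frame_correct[i] for i in sample) / len(sample)

    # Hypothetical usage for one test-case video of 1,500 annotated frames.
    judgements = [True] * 1400 + [False] * 100
    print(f"Estimated accuracy: {estimate_accuracy(judgements):.2%}")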
Summary
Unmanned Aerial Vehicles (UAVs), commonly known as drones, are increasingly popular in the research community and have been widely studied for their potential use in areas such as entertainment [1], marketing [2], healthcare [3], agriculture [4], and security [5]. Creating an “intelligent” UAV capable of navigating and interacting autonomously in the real world poses formidable challenges, because such a vehicle must process multiple tasks simultaneously: constantly sensing the environment; identifying and classifying both static and dynamic obstacles and targets; generating an internal representation of the real world in which the spatial relationships between the environment’s objects must be constantly updated; and reasoning and making the right decisions to react appropriately when unexpected events occur. New interdisciplinary fields such as cognitive computing [7] and cognitive infocommunication [8] aim to create bio-inspired engineering applications such as brain–computer interfaces [9] and computational systems capable of mimicking the intelligence of living beings (e.g., insects, rodents, primates, and humans) [10,11,12,13,14]. This paper presents a bio-inspired computational model of Visuospatial Working Memory (VSWM).
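The summary lists the tasks an autonomous UAV must handle concurrently: sensing, classifying static and dynamic objects, keeping an internal spatial representation up to date, and deciding how to react. The loop below is only a schematic sketch of how those stages could be chained per frame; every function here is a hypothetical stub, and the retention window and loop period are assumptions, not values from the paper.

    import time

    def sense_environment():
        """Stub: would return the drone's current camera frame (hypothetical)."""
        return None

    def detect_objects(frame):
        """Stub: would return {track_id: (label, (x, y))} for people and vehicles (hypothetical)."""
        return {}

    def decide(memory):
        """Stub: would map the remembered spatial layout to a flight command (hypothetical)."""
        return "hover"

    RETENTION_S = 5.0   # assumed retention window for occluded objects
    memory = {}         # track_id -> (label, position, last_seen)

    for _ in range(3):  # a few iterations for illustration
        now = time.time()
        frame = sense_environment()
        for tid, (label, pos) in detect_objects(frame).items():
            memory[tid] = (label, pos, now)
        # Keep occluded objects in the internal representation until the window expires.
        memory = {tid: e for tid, e in memory.items() if now - e[2] <= RETENTION_S}
        command = decide(memory)
        time.sleep(0.05)  # placeholder control-loop period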