Abstract
Unattended object detection, recognition, and tracking on unmanned reconnaissance platforms in battlefields and urban spaces are topics of emerging importance. In this paper, we present an unattended object recognition system that automatically detects objects of interest in video and classifies them into categories (e.g., person, car, truck). Our system is inspired by recent findings in visual neuroscience on the feed-forward object detection and recognition pathway and mirrors it via two main neuromorphic modules: (1) a front-end detection module that combines form- and motion-based visual attention to search for and detect "integrated" object percepts, as is hypothesized to occur in the human visual pathways; and (2) a back-end recognition module that processes only the detected object percepts through a neuromorphic classification algorithm based on multi-scale convolutional neural networks, which can be implemented efficiently in commercial off-the-shelf (COTS) hardware. Our neuromorphic system was evaluated on a variety of urban-area video data collected from both stationary and moving platforms. These data are quite challenging, as they include targets at long range under variable illumination and occlusion, amid heavy clutter. In our experiments the system showed excellent detection and classification performance. In addition, the proposed bio-inspired approach is well suited to hardware implementation owing to its low complexity and straightforward mapping to conventional off-the-shelf hardware.
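To make the back-end idea concrete, the following is a minimal, hypothetical sketch of multi-scale convolutional feature extraction for a detected object "chip": the chip is processed at several spatial scales with a small filter bank, rectified, and max-pooled into one feature vector that a classifier could consume. The filter weights are random placeholders, and all names (`conv2d`, `extract_features`, the scale factors) are illustrative assumptions, not the paper's trained network.

```python
# Hypothetical sketch only: multi-scale convolutional features for a
# detected object chip. Random kernels stand in for trained weights.
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2-D convolution of a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def downsample(img, factor):
    """Crude rescaling by striding (stands in for proper resizing)."""
    return img[::factor, ::factor]

def extract_features(chip, kernels, scales=(1, 2, 4)):
    """Convolve the chip at multiple scales, rectify (ReLU), and
    global-max-pool each response into one concatenated vector."""
    feats = []
    for s in scales:
        img = downsample(chip, s)
        for k in kernels:
            resp = np.maximum(conv2d(img, k), 0.0)  # ReLU nonlinearity
            feats.append(resp.max())                # global max pooling
    return np.array(feats)

rng = np.random.default_rng(0)
kernels = [rng.standard_normal((3, 3)) for _ in range(4)]
chip = rng.standard_normal((32, 32))        # a detected object percept
features = extract_features(chip, kernels)  # 3 scales x 4 filters
print(features.shape)                       # (12,)
```

In a full system, this fixed-size feature vector would feed a trained classifier layer; processing only detected percepts, rather than every pixel of every frame, is what keeps the back-end cheap enough for COTS hardware.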