Abstract

Pedestrian identification and tracking is a crucial task in smart building monitoring. Advances in sensing technology have drawn architects' attention to smart building design. Image distortions caused by numerous external environmental factors pose a significant problem for pedestrian recognition in smart buildings. Conventional machine learning algorithms and filter-based image classification methods, such as histogram of oriented gradients (HOG) filters, struggle to operate efficiently on large volumes of pedestrian images. Deep learning algorithms now perform substantially better when processing large amounts of image data. This article evaluates a novel multimodal classifier-based pedestrian identification method, Multimodal Faster RCNN Inception and ResNet V2 (MM Fast RCNN ResNet). The extracted features address a tracking problem and establish the foundation for several object recognition tasks, which constitutes the novelty of this work. The network is regularized, and the feature representation is automatically adapted to the detection task, resulting in the high accuracy of the proposed method. The proposed method is assessed on the Penn-Fudan dataset against contemporary techniques with respect to several factors. The proposed MM Fast RCNN ResNet obtains precision, recall, FPPI, FPPW, and average precision of 0.9057, 0.8629, 0.0898, and 0.0943.
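To make the evaluation setup concrete, the following is a minimal sketch of a Faster R-CNN pedestrian-detection pipeline with the precision, recall, and FPPI metrics named in the abstract. It does not reproduce the paper's MM Fast RCNN ResNet model: torchvision does not ship an Inception-ResNet V2 Faster R-CNN, so a pretrained ResNet-50 FPN backbone is used purely as a stand-in, and the dataset path, score threshold, IoU threshold, and empty ground-truth placeholders are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: Faster R-CNN pedestrian detection plus precision/recall/FPPI.
# The backbone, paths, and thresholds are assumptions for illustration only.
import os
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

DATA_DIR = "PennFudanPed/PNGImages"   # assumed local copy of the Penn-Fudan dataset
SCORE_THRESH = 0.5                    # assumed detection confidence threshold
IOU_THRESH = 0.5                      # assumed IoU threshold for counting a true positive

# Stand-in detector (ResNet-50 FPN), not the paper's Inception-ResNet V2 model.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect(image_path):
    """Return predicted person boxes (COCO label 1) above the score threshold."""
    img = convert_image_dtype(read_image(image_path), torch.float)
    with torch.no_grad():
        out = model([img])[0]
    keep = (out["labels"] == 1) & (out["scores"] >= SCORE_THRESH)
    return out["boxes"][keep]

def evaluate(predictions, ground_truths):
    """Greedy IoU matching over per-image box tensors (N_i x 4).
    Returns precision, recall, and false positives per image (FPPI)."""
    tp = fp = fn = 0
    for pred, gt in zip(predictions, ground_truths):
        if len(gt) == 0:
            fp += len(pred)
            continue
        if len(pred) == 0:
            fn += len(gt)
            continue
        iou = torchvision.ops.box_iou(pred, gt)  # shape (num_pred, num_gt)
        matched_gt = set()
        for i in range(len(pred)):
            j = int(iou[i].argmax())
            if iou[i, j] >= IOU_THRESH and j not in matched_gt:
                tp += 1
                matched_gt.add(j)
            else:
                fp += 1
        fn += len(gt) - len(matched_gt)
    n_images = len(predictions)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    fppi = fp / n_images if n_images else 0.0
    return precision, recall, fppi

if __name__ == "__main__":
    paths = sorted(os.path.join(DATA_DIR, f) for f in os.listdir(DATA_DIR))[:5]
    preds = [detect(p) for p in paths]
    # Ground-truth boxes would normally be parsed from PennFudanPed/Annotation;
    # empty placeholders are used here so the script runs end to end.
    gts = [torch.zeros((0, 4)) for _ in paths]
    print(evaluate(preds, gts))
```

FPPW (false positives per window) would additionally require the number of scanned windows per image, which a region-proposal detector does not expose directly, so it is omitted from this sketch.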
