Abstract

Automatic video-based human activity recognition has shown promise and is widely used in video surveillance applications for diverse purposes. However, substantial performance issues still make real-world deployment challenging. The main barrier to accurate detection of human movement remains the perspective problem, which arises because video sequences are frequently shot from arbitrary camera angles. This study therefore focuses on how human motion appears across the different camera views used to detect human activity or identify intruders. First, a Gaussian mixture model is applied for pre-processing, followed by minimum spanning tree segmentation: a pixel-based Kruskal algorithm determines the minimum spanning weight of the input data set, and segmentation is then performed. Independent discriminant features are extracted through a Karhunen-Loeve expansion, and human behavior is isolated using the Pearson correlation coefficient. A Deep Lens Classifier is then employed to identify any suspicious human behavior in the data. With 78.8774% accuracy, 28.6961% sensitivity, 98.50% specificity, 75.6734% precision, 48.781% recall, and a 65.10% F-measure, the approach is distinctive within the field of human activity detection. Finally, the proposed system is implemented and its performance evaluated in MATLAB.
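To make the processing chain concrete, the Python sketch below approximates each stage with common off-the-shelf components: scikit-learn's GaussianMixture for the GMM pre-processing, SciPy's minimum_spanning_tree as a stand-in for the pixel-based Kruskal step, PCA as the Karhunen-Loeve expansion, and Pearson correlation for feature selection, followed by the confusion-matrix metrics reported above. It is a minimal illustrative sketch under those assumptions, not the authors' MATLAB implementation, and the Deep Lens Classifier itself is not reproduced here.

# Illustrative sketch (not the authors' MATLAB implementation) of the pipeline
# described above: GMM pre-processing, MST-based segmentation, Karhunen-Loeve
# (PCA) features filtered by Pearson correlation, and the reported metrics.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.stats import pearsonr
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture


def foreground_mask(frame_gray, n_components=2):
    """Fit a GMM over pixel intensities; treat the smaller component as foreground."""
    x = frame_gray.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=n_components, random_state=0).fit(x)
    labels = gmm.predict(x).reshape(frame_gray.shape)
    fg_label = np.argmin(np.bincount(labels.ravel()))
    return labels == fg_label


def mst_weight(patch):
    """Minimum total edge weight of the 4-connected pixel graph of a patch."""
    patch = patch.astype(float)
    h, w = patch.shape
    idx = np.arange(h * w).reshape(h, w)
    rows = np.concatenate([idx[:, :-1].ravel(), idx[:-1, :].ravel()])
    cols = np.concatenate([idx[:, 1:].ravel(), idx[1:, :].ravel()])
    weights = np.concatenate([
        np.abs(patch[:, :-1] - patch[:, 1:]).ravel(),
        np.abs(patch[:-1, :] - patch[1:, :]).ravel(),
    ]) + 1e-6  # small offset so zero-weight edges are kept in the sparse graph
    graph = csr_matrix((weights, (rows, cols)), shape=(h * w, h * w))
    return minimum_spanning_tree(graph).sum()


def select_features(features, labels, threshold=0.1):
    """Karhunen-Loeve (PCA) expansion, keeping components correlated with the labels."""
    n_comp = min(10, features.shape[0], features.shape[1])
    comps = PCA(n_components=n_comp).fit_transform(features)
    keep = [i for i in range(comps.shape[1])
            if abs(pearsonr(comps[:, i], labels)[0]) > threshold]
    return comps[:, keep] if keep else comps


def evaluate(y_true, y_pred):
    """Binary confusion-matrix metrics corresponding to those reported above."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy = (tp + tn) / max(tp + tn + fp + fn, 1)
    sensitivity = recall = tp / max(tp + fn, 1)
    specificity = tn / max(tn + fp, 1)
    precision = tp / max(tp + fp, 1)
    f_measure = 2 * precision * recall / max(precision + recall, 1e-12)
    return {"accuracy": accuracy, "sensitivity": sensitivity,
            "specificity": specificity, "precision": precision,
            "recall": recall, "f_measure": f_measure}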

