Abstract

Video-based detection and classification of dynamic human behavior plays an important role in understanding human motion, with applications in crime monitoring, security surveillance, sports analysis, and more. This paper reports our initial research on dynamic human behavior pattern detection and classification from real-time video data. We focus on four dynamic human behaviors: walking, standing, running, and sitting. A Convolutional Long Short-Term Memory (CLSTM) model is proposed for video-based dynamic human behavior pattern detection and classification. This model combines a CNN with an LSTM to support learning, detection, and classification of dynamic human behavior patterns. AlexNet, a classic CNN architecture, is first used to learn visual representations of human behaviors from time-sequenced frames. Its outputs are then fed into the LSTM, which learns temporal sequence features to support detailed behavior pattern classification. A prototype Human Behavior Detection System built on the CLSTM model is reported, and early case-study results are presented.
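The architecture described above (a per-frame CNN feature extractor whose outputs feed an LSTM that classifies the four behaviors) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the small convolutional stack stands in for AlexNet, and the layer sizes, clip length, and frame resolution are all assumptions for demonstration.

```python
import torch
import torch.nn as nn

# The four behavior classes studied in the paper.
BEHAVIORS = ["walking", "standing", "running", "sitting"]

class CLSTM(nn.Module):
    """CNN features per frame, then an LSTM over the frame sequence.

    Hypothetical sketch: a tiny conv stack replaces AlexNet, and all
    dimensions (feat_dim, hidden, input size) are illustrative choices.
    """
    def __init__(self, feat_dim=256, hidden=128, num_classes=len(BEHAVIORS)):
        super().__init__()
        self.cnn = nn.Sequential(           # stand-in for AlexNet's conv layers
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),        # one 64-dim vector per frame
        )
        self.proj = nn.Linear(64, feat_dim)
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, clips):
        # clips: (batch, time, channels, height, width)
        b, t, c, h, w = clips.shape
        frames = clips.reshape(b * t, c, h, w)       # fold time into batch
        feats = self.cnn(frames).flatten(1)          # (b*t, 64)
        feats = self.proj(feats).reshape(b, t, -1)   # (b, t, feat_dim)
        out, _ = self.lstm(feats)                    # temporal features
        return self.head(out[:, -1])                 # classify last time step

model = CLSTM()
logits = model(torch.randn(2, 8, 3, 64, 64))  # 2 clips of 8 RGB 64x64 frames
print(logits.shape)  # torch.Size([2, 4])
```

Folding the time axis into the batch lets the same CNN weights process every frame; the LSTM then models how those per-frame features evolve, which is what distinguishes, say, standing from sitting down.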
