Abstract

The paper considers the problem of classifying the type of a person's physical activity from visual data. The authors propose using deep neural networks to determine the activity type. Systems that recognize human activity from video data or from a single image are currently used in many areas, for example in systems for monitoring the performance of enterprise employees, so recognizing human actions from visual data is a relevant task. The authors developed an algorithm for determining the type of physical activity from visual data based on the DenseNet121 and MobileNetV2 models. Because the pre-trained networks did not provide the required accuracy in detecting the type of physical activity, a deep neural network model was then built and its hyperparameters were selected. The software implementation of the model was written in Python in the IDLE environment. Experimental studies performed on the specialized UCF50 dataset, which contains 50 different types of human actions, confirm the effectiveness of the proposed approach. Additionally, the representativeness of the test set was increased with video sequences obtained from YouTube.
Purpose: development of an algorithm for determining a person's physical activity from visual data.
Methodology: the work uses computer vision methods, deep learning methods, and object-oriented programming.
Results: an algorithm for tracking a person's physical activity from visual data using deep learning technologies has been developed.
Practical implications: the results can be used in human activity monitoring systems, for example in tracking criminal activity, medical diagnostics, and monitoring the activity of office employees.
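To make the described approach concrete, the sketch below illustrates (in plain NumPy, as a hedged approximation rather than the authors' actual implementation) the transfer-learning idea of the abstract: feature maps from a pretrained backbone such as MobileNetV2 are pooled and passed through a new dense softmax head sized for the 50 UCF50 action classes. The feature-map shape (7, 7, 1280) and the randomly initialized weights are illustrative assumptions only.

```python
import numpy as np

NUM_CLASSES = 50  # UCF50 contains 50 action categories

def softmax(z):
    # Numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def classify_features(features, weights, bias):
    """Classification head applied on top of backbone features.

    features: (batch, h, w, c) feature maps from a pretrained CNN
              (e.g. the last convolutional block of MobileNetV2).
    weights:  (c, NUM_CLASSES) dense-layer weights (illustrative values).
    bias:     (NUM_CLASSES,) dense-layer bias.
    """
    pooled = features.mean(axis=(1, 2))   # global average pooling
    logits = pooled @ weights + bias      # dense layer for 50 classes
    return softmax(logits)                # per-class probabilities

# Toy example: random "feature maps" for a batch of 2 frames
rng = np.random.default_rng(0)
feats = rng.normal(size=(2, 7, 7, 1280))  # MobileNetV2-like output shape
W = rng.normal(size=(1280, NUM_CLASSES)) * 0.01
b = np.zeros(NUM_CLASSES)
probs = classify_features(feats, W, b)
print(probs.shape)                           # (2, 50)
print(bool(np.allclose(probs.sum(axis=1), 1.0)))  # True
```

In a real transfer-learning setup the backbone weights would come from ImageNet pre-training and only the new head (and possibly the last backbone layers) would be fine-tuned on the activity dataset.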
