Abstract
The challenges of an aging population are becoming increasingly prominent worldwide. Among these, falls among the elderly are a pressing concern, giving research and development of human fall detection technology practical application value. Because fall detection networks have a large number of parameters while embedded devices have limited computing power, such models are difficult to run on embedded platforms. This paper therefore proposes an OT-YOLOV3 (OpenCV + Tiny-YOLOV3) fall detection method. In this method, Gaussian filtering and other operations are used to preprocess fall images, reducing the influence of changes in image angle on the recognition result. The feature extraction network in Tiny-YOLOV3 is then replaced with the MobileNet network, which increases the number of network layers while reducing the model's parameters and computation. At the same time, multi-scale prediction is used to improve detection accuracy. Experimental results show that the accuracy of the proposed model is 10% higher than that of the YOLOV3 (You Only Look Once, version 3) model and 4% higher than that of the Tiny-YOLOV3 model, while the model size is only 45% of that of YOLOV3 and 65% of that of Tiny-YOLOV3. Compared with the YOLOV3 and Tiny-YOLOV3 methods, fall recognition is significantly improved and the model's memory footprint is reduced, meeting the requirements of real-time, efficient detection on embedded devices.
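To make the two core ideas concrete, the sketch below illustrates (1) OpenCV-based Gaussian preprocessing of a frame and (2) the depthwise-separable convolution block that MobileNet substitutes for standard convolutions. This is a minimal sketch under stated assumptions, not the authors' implementation: the abstract does not specify the framework, kernel size, input resolution, or layer configuration, so PyTorch, the 5x5 Gaussian kernel, and the 416x416 input size used here are illustrative choices.

```python
import cv2
import torch
import torch.nn as nn


def preprocess(image_path, size=416):
    """Gaussian-smooth and resize a frame before detection.

    The 5x5 kernel and 416x416 resolution are assumptions; the paper's
    abstract only states that Gaussian processing is applied.
    """
    img = cv2.imread(image_path)
    img = cv2.GaussianBlur(img, (5, 5), 0)  # suppress noise before detection
    img = cv2.resize(img, (size, size))     # match the network input size
    return img


class DepthwiseSeparableConv(nn.Module):
    """MobileNet-style block: depthwise 3x3 conv followed by pointwise 1x1.

    Replacing a standard 3x3 convolution with this pair is how MobileNet
    cuts parameters and multiply-adds while allowing a deeper network.
    """

    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        self.bn1 = nn.BatchNorm2d(in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.act(self.bn1(self.depthwise(x)))
        return self.act(self.bn2(self.pointwise(x)))
```

The parameter saving follows from the factorization: a standard 3x3 convolution costs 9 * C_in * C_out weights, whereas the depthwise-plus-pointwise pair costs 9 * C_in + C_in * C_out, roughly an 8-9x reduction for typical channel counts, which is consistent with the smaller model size reported above.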