Abstract

Population aging is accelerating worldwide, and care for the elderly will become an increasingly pressing problem in the coming decades. Against this background, the article designs a control system for a multifunctional nursing robot with an ATmega128 as the main controller. A convolutional neural network is used to estimate the positions of 3D human joints: joint coordinates detected in the colour map are mapped to the depth map using the parameters of the two cameras. Fifteen joint heat maps, each centred on a joint's depth-map coordinate, are then constructed, and the heat maps and the depth map together form the input to a second-level neural network. The position of the user's armpits is further estimated with image-processing techniques. We compare this method with other pose-estimation methods to verify its advantages.
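The colour-to-depth mapping described above can be sketched as a standard pinhole re-projection: back-project the colour pixel to a 3D point, transform it into the depth camera's frame, and project it again. The intrinsic matrices and the extrinsic transform below are hypothetical placeholders; in practice they come from calibrating the two cameras.

```python
import numpy as np

# Hypothetical calibration values (real ones come from camera calibration).
K_color = np.array([[525.0, 0.0, 319.5],
                    [0.0, 525.0, 239.5],
                    [0.0, 0.0, 1.0]])      # colour camera intrinsics
K_depth = np.array([[575.0, 0.0, 314.5],
                    [0.0, 575.0, 235.5],
                    [0.0, 0.0, 1.0]])      # depth camera intrinsics
R = np.eye(3)                              # rotation, colour -> depth frame
t = np.array([0.025, 0.0, 0.0])            # translation in metres

def color_to_depth(u, v, z):
    """Map a colour-map pixel (u, v) with depth z (metres)
    into depth-map pixel coordinates."""
    # Back-project the colour pixel to a 3D point in the colour camera frame.
    p_color = z * np.linalg.inv(K_color) @ np.array([u, v, 1.0])
    # Transform into the depth camera frame and re-project.
    p_depth = R @ p_color + t
    uv = K_depth @ (p_depth / p_depth[2])
    return uv[0], uv[1]
```

Applied to each detected joint, this yields the depth-map coordinates on which the joint heat maps are centred.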

Highlights

  • In recent years, the walking and mobility problems of the elderly, the disabled, and others have received increasing attention from researchers in many countries

  • Most research on human pose recognition focuses on estimating 2D human joint coordinates from colour maps. Deep learning methods based on large datasets have shown excellent results in detecting human joint points in colour images [1]

  • To ensure accurate human posture recognition for the transfer and transportation nursing robot, this paper uses a colour-map human joint detection model as the first-level neural network to compute the pixel coordinates of human joints in the colour map. These coordinates are then mapped to the depth map using the parameters of the two cameras
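The heat-map construction described above can be sketched with a 2D Gaussian placed at each joint's depth-map coordinate, one channel per joint. This is a common scheme; the paper does not specify the kernel width, so `sigma` below is an assumption.

```python
import numpy as np

def joint_heatmaps(joints, shape=(240, 320), sigma=7.0):
    """Build one Gaussian heat map per joint, centred on its
    depth-map pixel coordinate (u, v). The kernel width sigma
    is an assumed value, not taken from the paper."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]            # pixel coordinate grids
    maps = np.zeros((len(joints), h, w), dtype=np.float32)
    for i, (u, v) in enumerate(joints):
        maps[i] = np.exp(-((xs - u) ** 2 + (ys - v) ** 2) / (2 * sigma ** 2))
    return maps
```

For the 15 joints this yields a (15, H, W) stack that can be concatenated with the depth map as input to the second-level network.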


Summary

Introduction

The walking and mobility problems of the elderly, the disabled, and others have received increasing attention from researchers in many countries. Most research on human pose recognition focuses on estimating 2D human joint coordinates from colour maps, and deep learning methods based on large datasets have shown excellent results in detecting human joint points in colour images [1]. However, these algorithms cannot directly provide the transfer and transportation nursing robot with the position of the human body in the global coordinate system. To ensure accurate human posture recognition for the robot, this paper uses a colour-map human joint detection model as the first-level neural network to compute the pixel coordinates of human joints in the colour map. Artificial intelligence is a technology developed to simulate, extend, and expand human intelligence.
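The pixel-coordinate step above is typically decoded from the first-level network's output heat maps with a per-channel argmax. The decoder below is a standard sketch of that step, not the paper's exact method.

```python
import numpy as np

def joints_from_heatmaps(heatmaps):
    """Recover (u, v) pixel coordinates from a (J, H, W) stack of
    predicted joint heat maps via a per-channel argmax — a common
    decoding step; the paper's exact decoder is not specified."""
    J, H, W = heatmaps.shape
    flat = heatmaps.reshape(J, -1).argmax(axis=1)  # flat index of each peak
    vs, us = np.unravel_index(flat, (H, W))        # row (v) and column (u)
    return np.stack([us, vs], axis=1)              # one (u, v) row per joint
```

Each recovered (u, v) pair is the colour-map joint coordinate that is subsequently mapped into the depth map.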

Transfer and Transportation Nursing Robot
Level 1 Convolutional Neural Network
Recognition algorithm
Estimation of Underarm Points
