Abstract
To meet the requirements of multi-modal human–machine interaction between astronauts and robots in lunar surface exploration, three interaction modes based on speech, body posture, and tactile recognition, together with a test system, are studied. Three algorithms are proposed: an improved speech feature classification and recognition algorithm based on Mel-scale Frequency Cepstral Coefficients (MFCC); a three-stage body posture recognition algorithm comprising target detection, node recognition, and skeleton action recognition; and a tactile recognition algorithm based on a Radial Basis Function (RBF) network. For body posture recognition, a lightweight CVC-Net model is developed from the classical HourglassNet architecture by introducing depthwise separable convolution, an attention mechanism, and an improved up-sampling method. For skeleton action recognition, a lightweight network with an adaptive graph convolution module is designed by incorporating information from human body joints and skeleton vectors. These algorithms are suitable for embedded computing platforms and greatly reduce the number of model parameters while maintaining detection accuracy. In a simulated lunar surface environment with a lunar robot and a simulated space suit, ground tests of all three command-recognition modes are carried out, verifying that the algorithms meet the lunar robot's low-power, lightweight, and real-time requirements on an embedded computing platform.
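The abstract credits depthwise separable convolution with much of CVC-Net's parameter reduction. The sketch below illustrates why by comparing parameter counts of a standard convolution and its depthwise separable equivalent; the layer sizes are hypothetical (not taken from the paper) and biases are ignored for simplicity.

```python
def conv_params(k, c_in, c_out):
    # Standard convolution: every output channel mixes all input
    # channels through its own k x k kernel (biases ignored).
    return k * k * c_in * c_out

def separable_conv_params(k, c_in, c_out):
    # Depthwise separable convolution: one k x k filter per input
    # channel (depthwise), then a 1 x 1 pointwise layer that mixes
    # channels.
    depthwise = k * k * c_in
    pointwise = c_in * c_out
    return depthwise + pointwise

# Hypothetical layer: 3x3 kernel, 64 input channels, 128 output channels.
std = conv_params(3, 64, 128)            # 73728 parameters
sep = separable_conv_params(3, 64, 128)  # 8768 parameters
print(std, sep, round(std / sep, 1))     # roughly an 8x reduction
```

The reduction factor is approximately 1 / (1/c_out + 1/k²), so the savings grow with kernel size and channel count, which is why the technique suits the embedded platforms targeted here.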