Abstract
Understanding human intent from human motions remains a highly relevant and challenging research topic. The relationships within a sequence of human motions offer a possible route to recognizing human intention. The supervised multiple timescale recurrent neural network (supervised MTRNN) model is a useful tool for motion classification. In this paper, we propose a new model that understands human intention from human motions in real time through a deep structure comprising two supervised MTRNN models, based on interpreting the meaning of a series of human motions. The first supervised MTRNN layer classifies motion labels, while the second supervised MTRNN layer in the deep dynamic neural structure identifies human intention from the results of the first. We also consider the action–perception cycle between the two supervised MTRNNs, in which motion-label perception and internal action (motion prediction) form a cycle that improves motion classification and intent recognition. A group of tasks was designed around movements involving two objects to detect different motions and intentions with the proposed deep dynamic neural model. Experimental results show that the deep supervised MTRNN is more robust than, and outperforms, the single-layer supervised MTRNN model in detecting human intention. The action–perception cycle was found to efficiently improve both motion classification and prediction, which is important for human intent recognition.
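The layered structure described in the abstract can be illustrated with a minimal sketch: a fast recurrent layer maps motion features to motion-label probabilities, a slower recurrent layer maps those labels to intention probabilities, and the intention estimate is fed back into the first layer to close the action–perception cycle. All dimensions, weights, and the leaky-integration update here are illustrative assumptions, not the paper's trained implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class ToyMTRNNLayer:
    """Toy stand-in for one supervised MTRNN layer: a leaky (timescale-
    controlled) recurrent state read out into class probabilities."""
    def __init__(self, n_in, n_hidden, n_out, tau):
        self.W_in = rng.normal(0, 0.3, (n_hidden, n_in))
        self.W_rec = rng.normal(0, 0.3, (n_hidden, n_hidden))
        self.W_out = rng.normal(0, 0.3, (n_out, n_hidden))
        self.tau = tau                      # larger tau -> slower dynamics
        self.h = np.zeros(n_hidden)

    def step(self, x):
        # leaky integration, the core of a multiple-timescale unit
        u = self.W_in @ x + self.W_rec @ self.h
        self.h += (-self.h + np.tanh(u)) / self.tau
        return softmax(self.W_out @ self.h)

N_FEAT, N_MOTION, N_INTENT = 6, 4, 3       # assumed sizes for illustration

# layer 1 (fast): motion features + intention feedback -> motion labels
motion_layer = ToyMTRNNLayer(N_FEAT + N_INTENT, 16, N_MOTION, tau=2.0)
# layer 2 (slow): motion labels -> intention
intent_layer = ToyMTRNNLayer(N_MOTION, 16, N_INTENT, tau=8.0)

intent_probs = np.zeros(N_INTENT)          # feedback closing the cycle
for t in range(20):
    motion_feat = rng.normal(size=N_FEAT)  # placeholder sensor input
    # action–perception cycle: layer 2's output biases layer 1's input
    motion_probs = motion_layer.step(np.concatenate([motion_feat, intent_probs]))
    intent_probs = intent_layer.step(motion_probs)
```

At each timestep the intention estimate from the slow layer re-enters the fast layer, which is one simple way to realize the perception/internal-action loop the abstract describes.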