Abstract

This paper proposes a new technique for clustering-based human action representation built on optical flow analysis and the random sample consensus (RANSAC) method. The apparent motion of the human subject with respect to the background is detected and localized using optical flow analysis. The action is then characterized through the frequent movement of the optical flow points, or interest points, at different regions of the moving subject. The RANSAC algorithm filters out unwanted interest points scattered around the scene and retains only those related to that particular subject's motion. From the remaining salient interest points, the area of the human body within the frame is estimated. The rectangular area surrounding the human body is then segmented both horizontally and vertically, and the percentage change of interest points in each horizontal and vertical segment is estimated from frame to frame. Similar results are obtained for different persons performing the same action, and the corresponding values are averaged over the respective segments. The matrix constructed by this strategy is used as a feature vector for that particular action. Similar data are calculated for each block created at the intersections of the horizontal and vertical segments. In addition, the change in the position of the person along the X- and Y-axes is accumulated over an action and included in the feature vectors. For recognition using the extracted feature vectors, a distance-based similarity measure and a support vector machine-based classifier are employed. Several combinations of the feature vectors are examined. Extensive experimentation on benchmark motion databases shows that the proposed method offers not only a very high degree of accuracy but also computational savings.
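The per-segment feature construction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the choice of 4 horizontal and 4 vertical segments, and the synthetic interest-point coordinates are all assumptions for demonstration. It bins RANSAC-filtered interest points into horizontal and vertical strips of the estimated bounding box and computes the frame-to-frame percentage change of the counts in each segment.

```python
import numpy as np

def segment_change(points_prev, points_curr, n_h=4, n_v=4):
    """Sketch of the per-segment feature construction (names and segment
    counts are illustrative assumptions, not the paper's exact values).

    points_prev, points_curr: (N, 2) arrays of (x, y) interest-point
    coordinates in two consecutive frames, after RANSAC filtering.
    Returns the concatenated percentage change per horizontal and
    vertical segment of the subject's bounding box.
    """
    # Estimate the rectangular area of the body from the previous frame's
    # inlier interest points.
    x_min, y_min = points_prev.min(axis=0)
    x_max, y_max = points_prev.max(axis=0)

    def counts(pts):
        # Histogram points into n_h horizontal strips (binned along Y)
        # and n_v vertical strips (binned along X) inside the box.
        h, _ = np.histogram(pts[:, 1], bins=n_h, range=(y_min, y_max))
        v, _ = np.histogram(pts[:, 0], bins=n_v, range=(x_min, x_max))
        return h, v

    h_prev, v_prev = counts(points_prev)
    h_curr, v_curr = counts(points_curr)

    # Percentage change per segment; guard against empty segments.
    h_change = 100.0 * (h_curr - h_prev) / np.maximum(h_prev, 1)
    v_change = 100.0 * (v_curr - v_prev) / np.maximum(v_prev, 1)
    return np.concatenate([h_change, v_change])
```

Accumulating such vectors over the frames of a clip (plus the X/Y displacement of the box) would yield the feature matrix the abstract describes, which is then fed to a distance-based matcher or an SVM.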
