Abstract

Although hand pose estimation has achieved great success in recent years, RGB-based estimation still faces challenges, the most significant of which is the scarcity of labeled training data. Synthetic datasets provide plenty of images with accurate annotations, but their discrepancy from real-world data hurts generalization. A common remedy is therefore transfer learning, which transfers knowledge from a labeled source domain to an unlabeled target domain. Existing methods such as Mean Teacher, CycleGAN, and MCD train models with the help of easily accessible domains such as synthetic data. However, due to domain shift, these methods are not guaranteed to perform well in real-world settings. In this paper, we design a new unsupervised domain adaptation method for hand pose estimation, named Multi-branch Adversarial Regressors (MarsDA), which better supports feature transfer across domains. Specifically, we first generate pseudo-labels for the unlabeled target domain data. Then, a new adversarial training loss between multiple regression branches, designed specifically for hand pose estimation, is introduced to narrow the domain gap. In this way, our model reduces the pseudo-label noise caused by the domain gap and improves pseudo-label accuracy. We evaluate our method on two publicly available real-world datasets, H3D and STB. Experimental results show that our method outperforms existing methods by a large margin.
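To make the multi-branch adversarial idea concrete, below is a minimal PyTorch-style sketch of an MCD-like training step with a shared feature extractor and two regression heads. All names (PoseRegressor, discrepancy, adaptation_step, backbone, r1, r2) are hypothetical illustrations, and the loss shown is the generic maximum-discrepancy scheme adapted to keypoint regression, not the authors' exact MarsDA loss.

```python
# Hypothetical MCD-style sketch; not the authors' exact MarsDA loss.
import torch
import torch.nn as nn

class PoseRegressor(nn.Module):
    """One regression branch: image features -> 21 hand keypoints (x, y)."""
    def __init__(self, feat_dim=512, num_joints=21):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(feat_dim, 256), nn.ReLU(),
            nn.Linear(256, num_joints * 2),
        )

    def forward(self, f):
        return self.head(f)

def discrepancy(p1, p2):
    """L1 disagreement between the two branches' keypoint predictions."""
    return (p1 - p2).abs().mean()

def adaptation_step(backbone, r1, r2, opt_f, opt_r, xs, ys, xt):
    """One training step: labeled source batch (xs, ys), unlabeled target xt."""
    mse = nn.MSELoss()

    # (1) Supervised regression on the source domain for both branches.
    opt_f.zero_grad(); opt_r.zero_grad()
    fs = backbone(xs)
    loss_sup = mse(r1(fs), ys) + mse(r2(fs), ys)
    loss_sup.backward()
    opt_f.step(); opt_r.step()

    # (2) Heads maximize their disagreement on target data (backbone frozen),
    #     while staying accurate on the source domain.
    opt_r.zero_grad()
    fs = backbone(xs).detach()
    ft = backbone(xt).detach()
    loss_heads = (mse(r1(fs), ys) + mse(r2(fs), ys)
                  - discrepancy(r1(ft), r2(ft)))
    loss_heads.backward()
    opt_r.step()

    # (3) Backbone minimizes the disagreement, pulling target features
    #     toward regions where the branches agree (i.e., source-like features).
    opt_f.zero_grad()
    ft = backbone(xt)
    loss_feat = discrepancy(r1(ft), r2(ft))
    loss_feat.backward()
    opt_f.step()
```

The adversarial interplay in steps (2) and (3) is what narrows the domain gap: the heads expose target samples on which they disagree, and the backbone learns features that remove that disagreement. Under this reading, the averaged branch predictions on target data could serve as the pseudo-labels the abstract describes, with the reduced disagreement translating into less pseudo-label noise.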
