Description:
Discriminative models have demonstrated strong performance in hand pose estimation. However, key challenges remain: how to handle the self-similar parts of fingers, which often occlude one another, and how to reduce the discrepancy between synthetic and real data for accurate estimation. To handle occlusion, which leads to inaccurate estimation, this paper presents a probabilistic finger position detection framework. In this framework, the visibility correlation among fingers aids in predicting occluded finger parts, thereby improving the accuracy of hand pose estimation. Unlike conventional occlusion handling approaches, which treat occluded finger parts as independent detection targets, this paper presents a discriminative deep model that learns the visibility relationships among occluded finger parts at multiple layers. In addition, we propose a semi-supervised Transductive Regression (STR) forest for classification and regression to minimise the discrepancy between real and synthetic pose data. Experimental results demonstrate promising performance in occlusion handling and discrepancy reduction, with higher accuracy than state-of-the-art approaches.
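To give a rough sense of how visibility correlation can inform occluded-part prediction, the sketch below re-weights per-part detection scores using a correlation matrix over finger parts. This is only an illustrative toy in Python; the function name `refine_occluded_scores`, the specific weighting scheme, and the toy data are assumptions for exposition, not the paper's actual deep model or STR forest.

```python
import numpy as np

# Illustrative sketch (not the paper's actual model): given raw per-part
# detection scores and a learned visibility correlation matrix, re-weight
# the scores of likely-occluded finger parts using evidence borrowed from
# the parts they are most strongly correlated with.

def refine_occluded_scores(scores, visibility, correlation):
    """scores:      (P,) raw detection confidence per finger part
       visibility:  (P,) estimated probability that each part is visible
       correlation: (P, P) learned visibility correlation between parts
    """
    refined = scores.copy()
    for p in range(len(scores)):
        # Borrow evidence from correlated, visible parts in proportion to
        # how likely part p is to be occluded (i.e., low visibility).
        borrowed = correlation[p] @ (visibility * scores) / (correlation[p].sum() + 1e-8)
        refined[p] = visibility[p] * scores[p] + (1.0 - visibility[p]) * borrowed
    return refined

# Toy example with 5 finger parts; part 3 is heavily occluded.
scores = np.array([0.90, 0.80, 0.85, 0.20, 0.75])
visibility = np.array([0.95, 0.90, 0.90, 0.10, 0.85])
correlation = np.full((5, 5), 0.2) + 0.8 * np.eye(5)
print(refine_occluded_scores(scores, visibility, correlation))
```

In this toy run, the occluded part's score is pulled toward the evidence supplied by correlated visible parts rather than being treated as an independent detection, which is the intuition behind modelling visibility relationships jointly.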