Nguyen Duc Thao, Nguyen Viet Anh, Le Thanh Ha, Ngo Thi Duyen


Abstract

With the development of virtual reality (VR) technology and its applications in many fields, creating simulated hands in the virtual environment is an effective way to replace the controller and to enhance the user experience during interaction. The hand tracking problem has therefore attracted considerable research attention, contributing to the recognition of hand postures and the tracking of hand motions for VR input and human-machine interaction applications. To build a markerless real-time hand tracking system suitable for natural human-machine interaction, we propose a new method that combines generative and discriminative approaches to solve the hand tracking problem using a single RGBD camera. Our system removes the requirement that the user wear a colored wrist band and makes hand localization robust even in difficult tracking scenarios.
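To make the hybrid design concrete, the sketch below illustrates the general pattern of combining a discriminative per-frame detector with generative model-fitting refinement. It is a minimal illustration, not the authors' implementation: the toy 2-DoF "hand" (a blob centroid in a synthetic depth image), the function names, and the finite-difference refinement standing in for articulated-ICP-style optimization are all assumptions for exposition; a real system would use a full kinematic hand model, a trained detector, and depth frames from an RGBD camera.

```python
# Hypothetical sketch of a hybrid generative + discriminative tracking loop.
# Not the paper's method: the toy model and all names are illustrative only.
import numpy as np

def discriminative_predict(depth_frame):
    """Stand-in for a learned per-frame pose detector (e.g., a CNN).
    Here it returns a coarse pose guess: the centroid of hand pixels."""
    ys, xs = np.nonzero(depth_frame > 0)
    if len(xs) == 0:
        return None  # detection failed; caller falls back to temporal tracking
    return np.array([xs.mean(), ys.mean()])

def generative_energy(pose, depth_frame):
    """Model-fitting term: how well the hand model at `pose` explains the
    observed depth. Toy version: mean squared distance to hand pixels."""
    ys, xs = np.nonzero(depth_frame > 0)
    pts = np.stack([xs, ys], axis=1).astype(float)
    return np.mean(np.sum((pts - pose) ** 2, axis=1))

def refine(pose, depth_frame, iters=20, step=0.5):
    """Local generative refinement by finite-difference gradient descent,
    standing in for an articulated-ICP-style optimizer."""
    for _ in range(iters):
        grad = np.zeros_like(pose)
        for d in range(pose.size):
            e = np.zeros_like(pose)
            e[d] = 1e-3
            grad[d] = (generative_energy(pose + e, depth_frame)
                       - generative_energy(pose - e, depth_frame)) / 2e-3
        pose = pose - step * grad
    return pose

def track(frames):
    """Per frame: the detector proposes a pose (so tracking can recover
    after loss), generative refinement snaps it to the observation, and
    the previous pose is the fallback when detection fails."""
    pose = None
    for frame in frames:
        proposal = discriminative_predict(frame)
        init = proposal if proposal is not None else pose
        if init is None:
            continue  # no detection and no history yet
        pose = refine(init, frame)
        yield pose

# Toy usage: a "hand" blob drifting across synthetic 64x64 depth frames.
frames = []
for t in range(5):
    f = np.zeros((64, 64))
    f[20 + t:28 + t, 30 + t:38 + t] = 1.0
    frames.append(f)
for p in track(frames):
    print(np.round(p, 2))
```

The key design point the sketch preserves is the division of labor: the discriminative stage gives a drift-free initialization every frame, while the generative stage enforces consistency between the hand model and the observed depth, which is what allows recovery in difficult tracking scenarios without a wrist band.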


Keywords
Hand tracking, generative method, discriminative method, human performance capture

