A Gesture-based Recognition System for Augmented Reality

Vinothini Kasinathan (1), Aida Mustapha (2), Asti Amalia Nur Fajrillah (3)
(1) Faculty of Computing, Engineering and Technology, Asia Pacific University of Technology and Innovation, Technology Park Malaysia, 57000 Kuala Lumpur, Malaysia.
(2) Faculty of Computer Science and Information Technology, Universiti Tun Hussein Onn Malaysia, 86400 Parit Raja, Johor, Malaysia.
(3) School of Industrial Engineering, Telkom University, 40257 Bandung, West Java, Indonesia.
How to cite (IJASEIT) :
Kasinathan, Vinothini, et al. “A Gesture-Based Recognition System for Augmented Reality”. International Journal on Advanced Science, Engineering and Information Technology, vol. 9, no. 6, Dec. 2019, pp. 2182-9, doi:10.18517/ijaseit.9.6.6267.
With the geometric growth of Information Technology, conventional input devices are becoming increasingly obsolete and limiting. Experts in Human-Computer Interaction (HCI) are convinced that input devices remain the bottleneck of information acquisition, specifically when using Augmented Reality (AR) technology. Current input mechanisms cannot keep pace with the trend towards naturalness and expressivity, in which users perform natural gestures or operations that are converted into input. Hence, a more natural and intuitive input device is imperative, specifically gestural input, which HCI experts widely perceive as the next major input modality. To address this gap, this project develops a prototype hand gesture recognition system based on computer vision for modeling basic human-computer interactions. The main motivation of this work is a technology that requires no outfitting of additional equipment whatsoever by the users. The gesture-based hand recognition system was implemented using the Rapid Application Development (RAD) methodology and was evaluated for usability and performance through five levels of testing: unit testing, integration testing, system testing, recognition accuracy testing, and user acceptance testing. Unit, integration, and system testing, as well as user acceptance testing, produced favorable results. In conclusion, conventional input devices will continue to bottleneck this advancement in technology; a better alternative input technique should therefore be explored, in particular gesture-based input, which offers users more natural and intuitive control.
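The abstract identifies the system as computer-vision based but gives no implementation details. As an illustrative sketch only, a common vision pipeline for this task segments the hand from the camera frame, extracts its contour and convex hull, and counts convexity defects (the gaps between extended fingers) to name the gesture. The function name and gesture labels below are hypothetical, not taken from the paper; only the final counting-to-label step is shown:

```python
# Hypothetical final stage of a contour-based hand gesture recognizer.
# In a full pipeline (e.g., with OpenCV), the defect count would come from
# convex-hull/convexity-defect analysis of the segmented hand contour;
# here that count is simply taken as input.

# Assumed mapping from convexity-defect count to a coarse gesture label.
GESTURES = {
    0: "fist",          # no gaps between fingers
    1: "two fingers",   # one gap
    2: "three fingers",
    3: "four fingers",
    4: "open palm",     # four gaps between five extended fingers
}

def classify_gesture(defect_count: int) -> str:
    """Map a convexity-defect count to a coarse gesture name."""
    return GESTURES.get(defect_count, "unknown")

if __name__ == "__main__":
    print(classify_gesture(0))  # fist
    print(classify_gesture(4))  # open palm
```

A real system built this way would also filter noisy frames (e.g., by angle and depth thresholds on each defect) before counting, since raw convexity defects include spurious points along the wrist and contour jitter.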

