Algorithmic Efficiency of Stroke Gesture Recognizers: a Comparative Analysis

Ana Belén Erazo, Jorge Luis Pérez Medina
Intelligent and Interactive Systems Lab (SI2-Lab), Universidad de Las Américas (UDLA), Quito, 170504, Ecuador
How to cite (IJASEIT):
Erazo, Ana Belén, and Jorge Luis Pérez Medina. “Algorithmic Efficiency of Stroke Gesture Recognizers: A Comparative Analysis”. International Journal on Advanced Science, Engineering and Information Technology, vol. 10, no. 2, Mar. 2020, pp. 438-46, doi:10.18517/ijaseit.10.2.10807.
Gesture interaction is today recognized as a natural, intuitive way to issue commands to an interactive system. To support it, several stroke gesture recognizers have been proposed that recognize end-user gestures from a training set with increasing efficiency. Although each algorithm reports its own recognition rates, the published experiments were carried out under different conditions, producing results that are not directly comparable, so it remains unclear which algorithm is the most recommendable. To better understand their respective algorithmic efficiency, this paper compares the recognition rate, the error rate, and the recognition time of five reference stroke gesture recognition algorithms, i.e., $1, $P, $Q, !FTL, and Penny Pincher, on three diverse gesture sets, i.e., NicIcon, HHreco, and Utopiano Alphabet, in a user-independent scenario. All algorithms were executed under the same conditions, with a common method for evaluating the error rate, the recognition rate, and the execution time of each algorithm. A software testing environment was developed in JavaScript to perform the comparative analysis. The results of this analysis help recommend the recognizer that turns out to be the most efficient in each setting: !FTL (NLSD) achieves the best recognition rate and is the most efficient algorithm for the HHreco and NicIcon datasets, although Penny Pincher is the fastest algorithm on HHreco. Finally, $1 obtains the best recognition rate on the Utopiano Alphabet dataset.
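The evaluation method described above (a common harness measuring recognition rate, error rate, and per-gesture recognition time for each algorithm) can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual test environment: the function names (`evaluateRecognizer`, `train`, `recognize`) and the toy nearest-neighbour recognizer are assumptions introduced here solely to show the shape of such a harness in JavaScript, the language the paper's environment was built in.

```javascript
// Evaluate one recognizer on a labeled test set after training it on a
// disjoint training set (user-independent: train and test come from
// different users). Returns the three measures the paper compares.
function evaluateRecognizer(recognizer, trainSet, testSet) {
  recognizer.train(trainSet);
  let correct = 0;
  const t0 = Date.now();
  for (const sample of testSet) {
    if (recognizer.recognize(sample.points) === sample.label) correct++;
  }
  const elapsedMs = Date.now() - t0;
  const recognitionRate = correct / testSet.length;
  return {
    recognitionRate,
    errorRate: 1 - recognitionRate,
    msPerGesture: elapsedMs / testSet.length, // mean recognition time
  };
}

// Toy 1-nearest-neighbour "recognizer" (sum of point-wise Euclidean
// distances over equal-length point lists) used only to exercise the
// harness; the real $-family recognizers resample and normalize first.
const toyRecognizer = {
  templates: [],
  train(samples) { this.templates = samples; },
  recognize(points) {
    let best = null, bestDist = Infinity;
    for (const t of this.templates) {
      const d = points.reduce(
        (acc, p, i) => acc + Math.hypot(p[0] - t.points[i][0], p[1] - t.points[i][1]),
        0);
      if (d < bestDist) { bestDist = d; best = t.label; }
    }
    return best;
  },
};
```

Running each algorithm through the same `evaluateRecognizer` call on the same train/test splits is what makes the resulting rates and timings comparable across recognizers.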


This work is licensed under a Creative Commons Attribution 4.0 International License.
