Magrofuoco, N., Vanderdonckt, J., Roselli, P. (2025). Performance Testing of Stroke Gesture Recognizers with Gester. PROCEEDINGS OF THE ACM ON HUMAN-COMPUTER INTERACTION, 9(4), 1-27 [10.1145/3734186].

Performance Testing of Stroke Gesture Recognizers with Gester

Paolo Roselli
2025-01-01

Abstract

The performance of state-of-the-art stroke gesture recognizers has been extensively tested for accuracy and speed under user-dependent or user-independent procedures on a wide variety of gesture sets. These tests were carried out under heterogeneous experimental conditions, making the results non-comparable from one source to another and calling for a consistent assessment that ensures reproducibility of the results. When a new gesture set is to be incorporated into a gesture-based user interface, and/or when a new recognizer is made available, this gesture set needs to be tested with existing and new recognizers to determine which recognizer offers the best performance in a given context of use. To this end, this technical note presents Gester, a software tool that automates the performance testing of stroke gesture recognizers in terms of accuracy (through three recognition rates) and speed (through four execution times). It exports data for visualizing confusion matrices and confusion wheels. We elaborate on typical use cases supported by Gester, which include the two traditional testing procedures, i.e., the user-dependent and user-independent scenarios, and provide two new ones, i.e., dataset-dependent and dataset-independent, to test the performance of recognizers under new conditions. Then, we illustrate a series of activities.
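The two traditional testing procedures named in the abstract can be sketched as follows. This is a minimal illustration, not Gester's actual API: the function names, the `(gesture_class, points)` sample layout, and the `classify(points, templates)` signature are all assumptions made for the example. In the user-dependent procedure, a recognizer is trained and tested on samples from the same participant; in the user-independent procedure, it is trained on every other participant's samples and tested on a held-out participant.

```python
import random

def user_dependent_trial(samples_by_user, classify, n_train=2):
    """For each user, train on n_train of their OWN samples, test on the rest."""
    correct = total = 0
    for user, user_samples in samples_by_user.items():
        samples = list(user_samples)        # copy so we do not mutate the input
        random.shuffle(samples)
        templates = samples[:n_train]       # this user's own training samples
        for gesture_class, points in samples[n_train:]:
            total += 1
            if classify(points, templates) == gesture_class:
                correct += 1
    return correct / total if total else 0.0

def user_independent_trial(samples_by_user, classify, test_user):
    """Train on all OTHER users' samples, test on the held-out user."""
    templates = [s for u, ss in samples_by_user.items() if u != test_user
                 for s in ss]
    correct = total = 0
    for gesture_class, points in samples_by_user[test_user]:
        total += 1
        if classify(points, templates) == gesture_class:
            correct += 1
    return correct / total if total else 0.0
```

Any template-based recognizer (e.g., a nearest-template matcher) can be plugged in as `classify`; the recognition rate returned by each trial is one of the accuracy measures a benchmark like Gester would aggregate over repeated runs.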
2025
Published
International relevance
Article
Anonymous peer reviewers
Sector MAT/01
Sector INFO-01/A - Computer Science
English
Without ISI Impact Factor
Benchmarking
Gestural interaction
Gesture datasets
Gesture recognition
Gesture user interfaces
Performance testing
Magrofuoco, N; Vanderdonckt, J; Roselli, P
Journal article
Files in this item:
File: Gester-EICS2025.pdf
Type: Editorial Version (PDF)
License: Publisher's copyright
Size: 1.68 MB
Format: Adobe PDF
Access: authorized users only

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/2108/448823
Citations
  • Scopus: 1