Recognizing 3D trajectories as 2D multi-stroke gestures

Roselli P.;
2020-01-01

Abstract

While end users can acquire full 3D gestures with many input devices, they often capture only 3D trajectories, which are 3D uni-path, uni-stroke, single-point gestures performed in thin air. Such trajectories, with their $(x,y,z)$ coordinates, could be interpreted as three 2D stroke gestures projected on three planes, i.e., $XY$, $YZ$, and $ZX$, thus making them admissible for established 2D stroke gesture recognizers. To investigate whether 3D trajectories can be recognized effectively and efficiently, four 2D stroke gesture recognizers, i.e., $P, $P+, $Q, and Rubine, are extended to the third dimension: $P^3, $P+^3, $Q^3, and Rubine-Sheng, an extension of Rubine to 3D with more features. Two new variations are also introduced: $F for flexible cloud matching and FreeHandUni for uni-path recognition. Rubine3D, another extension of Rubine to 3D that projects the 3D gesture onto three orthogonal planes, is also included. These seven recognizers are compared against three challenging datasets containing 3D trajectories, i.e., SHREC2019 and 3DTCGS in a user-independent scenario, and 3DMadLabSD with its four domains in both user-dependent and user-independent scenarios, with a varying number of templates and varying sampling. Individual recognition rates and execution times per dataset, and aggregated ones over all datasets, show a highly significant difference in favor of $P+^3 over its competitors. The potential effects of the dataset, the number of templates, and the sampling are also studied.
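
Since the record carries only the abstract, the following is a minimal, hypothetical Python sketch of the two ideas it names: running $P-style greedy cloud matching directly on 3D points (the principle behind $P^3) and projecting a 3D trajectory onto the XY, YZ, and ZX planes so that 2D recognizers become applicable (the principle behind Rubine3D). Function names, the resampling size N, and the simplified single-start matching are assumptions for illustration, not the paper's reference implementations.

```python
import math

# Illustrative only: (a) $P-style greedy cloud matching computed directly on
# 3D points, in the spirit of $P^3, and (b) Rubine3D-style projection of one
# 3D trajectory onto the XY, YZ, and ZX planes. Names and parameters are
# assumptions, not the authors' reference code.

N = 32  # number of resampled points, a common choice in the $-family


def path_length(points):
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))


def resample(points, n=N):
    """Resample a trajectory to n points spaced evenly along its path."""
    length = path_length(points)
    if length == 0:
        return [points[0]] * n
    interval = length / (n - 1)
    pts = list(points)
    resampled = [pts[0]]
    d = 0.0
    i = 1
    while i < len(pts) and len(resampled) < n:
        seg = math.dist(pts[i - 1], pts[i])
        if d + seg >= interval and seg > 0:
            t = (interval - d) / seg
            q = tuple(a + t * (b - a) for a, b in zip(pts[i - 1], pts[i]))
            resampled.append(q)
            pts.insert(i, q)   # continue measuring from the inserted point
            d = 0.0
        else:
            d += seg
        i += 1
    while len(resampled) < n:  # guard against floating-point shortfalls
        resampled.append(pts[-1])
    return resampled


def normalize(points):
    """Translate to the centroid and scale into a unit cube."""
    dims = len(points[0])
    centroid = [sum(p[k] for p in points) / len(points) for k in range(dims)]
    shifted = [tuple(p[k] - centroid[k] for k in range(dims)) for p in points]
    size = max(abs(c) for p in shifted for c in p) or 1.0
    return [tuple(c / size for c in p) for p in shifted]


def greedy_cloud_distance(a, b):
    """Simplified one-start greedy point-cloud matching in the $P spirit;
    the only 3D-specific part is that distances are taken on 3D tuples."""
    matched = [False] * len(b)
    total = 0.0
    for i, p in enumerate(a):
        best_j = min((j for j in range(len(b)) if not matched[j]),
                     key=lambda j: math.dist(p, b[j]))
        matched[best_j] = True
        total += (1 - i / len(a)) * math.dist(p, b[best_j])  # early matches weigh more
    return total


def recognize(candidate, templates):
    """Return the label of the closest template under the cloud distance."""
    c = normalize(resample(candidate))
    best_label, best_score = None, float("inf")
    for label, pts in templates:
        t = normalize(resample(pts))
        score = min(greedy_cloud_distance(c, t), greedy_cloud_distance(t, c))
        if score < best_score:
            best_label, best_score = label, score
    return best_label


def project_on_planes(points):
    """Rubine3D-style preprocessing: one 3D trajectory becomes three 2D
    strokes by dropping one coordinate at a time (XY, YZ, ZX planes)."""
    xy = [(x, y) for x, y, z in points]
    yz = [(y, z) for x, y, z in points]
    zx = [(z, x) for x, y, z in points]
    return xy, yz, zx
```

In this sketch, a template set would be a list of (label, points) pairs; recognize(candidate, templates) returns the label of the nearest template, while project_on_planes could instead feed each 2D projection to an unmodified 2D recognizer.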
2020
Published
International relevance
Article
Anonymous experts
Settore MAT/04 - Complementary Mathematics
English
Human-centered computing; Gestural input; Graphical user interfaces
Ousmer, M., Sluÿters, A., Magrofuoco, N., Roselli, P., Vanderdonckt, J. (2020). Recognizing 3D trajectories as 2D multi-stroke gestures. PROCEEDINGS OF THE ACM ON HUMAN-COMPUTER INTERACTION, 4(ISS), 1-21 [10.1145/3427326].
Ousmer, M; Sluÿters, A; Magrofuoco, N; Roselli, P; Vanderdonckt, J
Journal article
Files in this product:
2020-09-29-Trajectories-ISS2020.pdf: Editorial version (PDF), publisher's copyright, 6.17 MB, Adobe PDF, restricted access (authorized users only)


Use this identifier to cite or link to this document: https://hdl.handle.net/2108/323746
Citazioni
  • PMC: not available
  • Scopus: 4
  • Web of Science (ISI): not available