

A new glove for gesture recognition and classification for surgical skill assessment

Santosuosso, Giovanni Luca; Giannini, Franco; Sileri, Pierpaolo; Saggio, Giovanni; Di Lorenzo, Nicola
2012-01-01

Abstract

AIM The understanding of surgical gesture, by means of a measuring apparatus, can play a key role in the evaluation of surgical performance. To this aim, a neural network classification algorithm can be helpful, since it combines good generalization performance with a parsimonious architecture when dealing with high-dimensional classification problems. We present its use as a surgical training tool for both laparoscopic and open surgery, a field of research highly underrepresented in the surgical teaching scenario. We applied a bounding-box decomposition to the analysis of the surgeon's hand movements and to gesture recognition during the training of novice surgeons. This feature was used to analyze trajectories of the surgeon's wrist and finger postures, so as to recognize different hand gestures.

METHODS Dataset of surgical gestures: 5 master surgeons, 5 resident surgeons and 5 attending surgeons performed the following tasks: interrupted stitch; running suture; knot-tying exercise. Gesture measurement: building on previously acquired experience, we developed a data glove equipped with sensors that measure the movements of the distal interphalangeal, proximal interphalangeal and metacarpophalangeal finger joints, as well as wrist postures. Gesture classification: we synthesized an algorithm that automatically assigns each gesture to a predefined class.

RESULTS Operator's training: currently, mentors transfer their expertise to trainees via practical demonstrations and oral instructions. With the recorded measurement data, such movements can be reproduced via an avatar representation on a PC screen. This has the important advantage that the same gesture can be replayed any number of times in exactly the same manner, and that it can be observed from every possible point of view simply by rotating, translating and zooming the avatar. We developed a graphical interface capable of superimposing a "ghost" avatar of the learner upon the "guide" avatar of the expert. In this manner the trainee can easily and intuitively self-evaluate her/his performance.

CONCLUSIONS This work, still in progress, aims to provide an innovative, accurate and non-invasive method to measure and evaluate surgical gestures. It should help accelerate the learning curve of the in-training surgeon, who can compare his or her basic level of expertise with that of a master surgeon and verify improvement step by step.
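The bounding-box decomposition mentioned in the AIM can be made concrete with a minimal sketch (not the authors' implementation; the segment length and the (x, y, z) sample format are assumptions): a wrist trajectory is split into fixed-length segments, and each segment is summarized by its axis-aligned bounding box, yielding a compact feature for gesture recognition.

```python
# Illustrative sketch only: decompose a 3-D wrist trajectory into
# fixed-length segments and describe each segment by its axis-aligned
# bounding box. Segment length and coordinate format are assumptions.

def bounding_box(points):
    """Axis-aligned bounding box of a list of (x, y, z) points,
    returned as ((xmin, xmax), (ymin, ymax), (zmin, zmax))."""
    xs, ys, zs = zip(*points)
    return (min(xs), max(xs)), (min(ys), max(ys)), (min(zs), max(zs))

def decompose(trajectory, segment_len=25):
    """Split the trajectory into consecutive segments of segment_len
    samples and return one bounding box per segment."""
    return [bounding_box(trajectory[i:i + segment_len])
            for i in range(0, len(trajectory) - segment_len + 1, segment_len)]
```

The resulting sequence of boxes is far lower-dimensional than the raw trajectory, which suits the "parsimonious architecture" goal stated in the abstract.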
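The classification step assigns each measured gesture to a predefined class. The abstract uses a neural network for this; as a far simpler stand-in to make the idea concrete, the sketch below uses a nearest-centroid rule over joint-angle feature vectors (all class names and the feature layout are hypothetical, not taken from the paper).

```python
import math

# Minimal sketch of gesture classification from glove readings. Each
# sample is a vector of joint angles (e.g. DIP, PIP, MCP per finger,
# plus wrist). A nearest-centroid rule stands in for the neural
# network used in the actual work.

def centroid(samples):
    """Component-wise mean of a list of equal-length feature vectors."""
    return [sum(col) / len(samples) for col in zip(*samples)]

def train(labelled):
    """labelled: {gesture_name: [feature_vector, ...]} -> one centroid
    per gesture class."""
    return {name: centroid(samples) for name, samples in labelled.items()}

def classify(model, sample):
    """Assign the sample to the class whose centroid is nearest in
    Euclidean distance."""
    return min(model, key=lambda name: math.dist(model[name], sample))
```

A neural network replaces the centroid rule when the classes are not well separated by a simple distance in feature space, at the cost of a training procedure.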
20th International EAES Congress - European Association for Endoscopic Surgery
Brussels, Belgium
2012
International relevance
contribution
Jun-2012
2012
Settore ING-INF/01 - Electronics
English
Conference contribution
Lazzaro, A., Corona, A., Sbernini, L., Santosuosso, G.L., Giannini, F., Pinto, C., et al. (2012). A new glove for gesture recognition and classification for surgical skill assessment. In 20th International Congress of the European Association for Endoscopic Surgery (EAES), Brussels, Belgium, 20–23 June 2012. Springer [10.1007/s00464-013-2876-9].
Lazzaro, A.; Corona, A.; Sbernini, L.; Santosuosso, G.L.; Giannini, F.; Pinto, C.; Iezzi, L.; Sileri, P.; Saggio, G.; Di Lorenzo, N.; Gaspari, A.


Use this identifier to cite or link to this item: https://hdl.handle.net/2108/98290