Binetti, N., Wu, L., Chen, S., Kruijff, E., Julier, S., Brumby, D.P. (2021). Using visual and auditory cues to locate out-of-view objects in head-mounted augmented reality. DISPLAYS, 69 [10.1016/j.displa.2021.102032].

Using visual and auditory cues to locate out-of-view objects in head-mounted augmented reality

Binetti, N.; Wu, L.; Chen, S.; Kruijff, E.; Julier, S.; Brumby, D.P.
2021-01-01

Abstract

When looking for an object in a complex visual scene, augmented reality (AR) can assist search with visual cues persistently pointing in the target's direction. The effectiveness of these visual cues can be reduced if they are placed at a different visual depth plane from the target they are indicating. To overcome this visual-depth problem, we test the effectiveness of adding simultaneous spatialized auditory cues that are fixed at the target's location. In an experiment, we manipulated which cue(s) were available (visual-only vs. visual + auditory) and on which disparity plane relative to the target the visual cue was displayed. Results show that participants were slower at finding targets when the visual cue was placed on a different disparity plane from the target. However, this slowdown in search performance could be substantially reduced with auditory cueing. These results demonstrate the importance of AR cross-modal cueing under conditions of visual uncertainty and show that designers should consider augmenting visual cues with auditory ones.
2021
Published
International relevance
Article
Anonymous peer review
Field M-PSI/01
English
Augmented Reality
Head-mounted Display
Out-of-view Objects
Visual Discrimination
Visual Cueing
Auditory Cueing
Binetti, N; Wu, L; Chen, S; Kruijff, E; Julier, S; Brumby, DP
Journal article
Files in this product:
Binetti et al 2021 - Using visual and auditory cues to locate out-of-view objects in head-mounted augmented reality.pdf
Authorized users only
Type: Publisher's version (PDF)
License: Publisher's copyright
Size: 1.78 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2108/363683
Citations
  • PMC: n/a
  • Scopus: 18
  • Web of Science: 12