A novel task and methods to evaluate inter-individual variation in audio-visual associative learning

I Altarelli
2024-01-01

Abstract

Learning audio-visual associations is foundational to a number of real-world skills, such as reading acquisition or social communication. Characterizing individual differences in such learning has therefore been of interest to researchers in the field. Here, we present a novel audio-visual associative learning task designed to efficiently capture inter-individual differences in learning, with the added feature of using non-linguistic stimuli, so as to unconfound the language and reading proficiency of the learner from their more domain-general learning capability. By fitting trial-by-trial performance in our novel learning task using simple-to-use statistical tools, we demonstrate the expected inter-individual variability in learning rate as well as high precision in its estimation. We further demonstrate that such measured learning rate is linked to working memory performance in Italian-speaking (N = 58) and French-speaking (N = 51) adults. Finally, we investigate the extent to which learning rate in our task, which measures cross-modal audio-visual associations while mitigating familiarity confounds, predicts reading ability across participants with different linguistic backgrounds. The present work thus introduces a novel non-linguistic audio-visual associative learning task that can be used across languages. In doing so, it brings a new tool to researchers in the various domains that rely on multisensory integration, from reading to social cognition or socio-emotional learning.
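To give a concrete sense of the trial-by-trial fitting mentioned in the abstract, the sketch below fits a three-parameter exponential learning curve to one simulated participant's binary accuracy data by maximum likelihood, yielding a per-participant learning-rate estimate. This is a minimal illustration only: the model form, parameter names, and use of SciPy are assumptions for demonstration and are not the paper's actual analysis pipeline.

```python
# Minimal sketch (not the authors' code): estimating a learning rate from
# trial-by-trial 0/1 accuracy with an assumed exponential learning curve.
import numpy as np
from scipy.optimize import minimize

def learning_curve(trial, start, asymptote, rate):
    """Expected accuracy on a given trial under exponential learning."""
    return asymptote + (start - asymptote) * np.exp(-rate * trial)

def neg_log_likelihood(params, trials, correct):
    """Bernoulli negative log-likelihood of the observed 0/1 responses."""
    start, asymptote, rate = params
    p = np.clip(learning_curve(trials, start, asymptote, rate), 1e-6, 1 - 1e-6)
    return -np.sum(correct * np.log(p) + (1 - correct) * np.log(1 - p))

# Simulate one participant: 60 trials, true learning rate 0.08.
rng = np.random.default_rng(0)
trials = np.arange(60)
correct = rng.binomial(1, learning_curve(trials, 0.5, 0.95, 0.08))

fit = minimize(
    neg_log_likelihood,
    x0=[0.5, 0.9, 0.05],  # chance-level start, high asymptote, slow rate
    args=(trials, correct),
    bounds=[(0.0, 1.0), (0.0, 1.0), (1e-4, 1.0)],
    method="L-BFGS-B",
)
start_hat, asymptote_hat, rate_hat = fit.x
print(f"estimated learning rate: {rate_hat:.3f}")
```

Repeating such a fit for each participant is one way the abstract's "inter-individual variability in learning rate" could be quantified; the paper's own statistical tools may differ.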
2024
Published
International relevance
Article
Anonymous expert reviewers
Sector PSIC-01/A - General Psychology
English
Pasqualotto, A., Cochrane, A., Bavelier, D., Altarelli, I. (2024). A novel task and methods to evaluate inter-individual variation in audio-visual associative learning. COGNITION, 242, 1-13.
Pasqualotto, A; Cochrane, A; Bavelier, D; Altarelli, I
Journal article
Files in this record:

Altarelli_Cognition_2024.pdf (open access)

Type: Publisher's version (PDF)
License: Creative Commons
Size: 2.74 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2108/393951
Citations
  • PMC: ND
  • Scopus: 2
  • ISI (Web of Science): 1