A deep graph neural network architecture for modelling spatio-temporal dynamics in resting-state functional MRI data

Toschi, N
2022-01-01

Abstract

Resting-state functional magnetic resonance imaging (rs-fMRI) has been successfully employed to understand the organisation of the human brain. Typically, the brain is parcellated into regions of interest (ROIs) and modelled as a graph where each ROI represents a node and association measures between ROI-specific blood-oxygen-level-dependent (BOLD) time series are edges. Recently, graph neural networks (GNNs) have seen a surge in popularity due to their success in modelling unstructured relational data. The latest developments with GNNs, however, have not yet been fully exploited for the analysis of rs-fMRI data, particularly with regard to its spatio-temporal dynamics. In this paper, we present a novel deep neural network architecture which combines both GNNs and temporal convolutional networks (TCNs) in order to learn from both the spatial and temporal components of rs-fMRI data in an end-to-end fashion. In particular, this corresponds to intra-feature learning (i.e., learning temporal dynamics with TCNs) as well as inter-feature learning (i.e., leveraging interactions between ROI-wise dynamics with GNNs). We evaluate our model with an ablation study using 35,159 samples from the UK Biobank rs-fMRI database, as well as in the smaller Human Connectome Project (HCP) dataset, both in a unimodal and in a multimodal fashion. We also demonstrate that our architecture contains explainability-related features which easily map to realistic neurobiological insights. We suggest that this model could lay the groundwork for future deep learning architectures focused on leveraging the inherently and inextricably spatio-temporal nature of rs-fMRI data.
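The abstract describes a pipeline that builds a graph over ROIs from their BOLD time series and couples temporal convolutional networks (intra-feature learning over each ROI's dynamics) with graph neural networks (inter-feature learning across ROIs). The sketch below is purely illustrative and is not the authors' implementation: all module names, layer sizes, the choice of Pearson correlation for edge weights, and the dense graph-convolution step standing in for a GNN layer are assumptions made here for clarity.

```python
# Hypothetical sketch (not the paper's code): build an ROI graph from BOLD
# time series via Pearson correlation, encode per-ROI temporal dynamics with
# dilated causal 1-D convolutions (TCN-style), then mix information across
# ROIs with one dense graph-convolution step using the correlation matrix.
import torch
import torch.nn as nn


def correlation_adjacency(bold: torch.Tensor) -> torch.Tensor:
    """bold: (n_rois, n_timepoints) -> (n_rois, n_rois) Pearson correlation matrix."""
    x = bold - bold.mean(dim=1, keepdim=True)
    x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
    return x @ x.t()


class TemporalBlock(nn.Module):
    """One dilated causal convolution with a residual connection (TCN-style)."""
    def __init__(self, channels: int, kernel_size: int, dilation: int):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation          # left-pad to preserve causality
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.act = nn.ReLU()

    def forward(self, x):                                # x: (batch, channels, time)
        out = self.conv(nn.functional.pad(x, (self.pad, 0)))
        return self.act(out) + x


class SpatioTemporalSketch(nn.Module):
    """Intra-feature (TCN) followed by inter-feature (graph) learning on ROI series."""
    def __init__(self, n_rois: int, hidden: int = 32, n_classes: int = 2):
        super().__init__()
        self.embed = nn.Conv1d(1, hidden, kernel_size=1)  # lift each ROI series to `hidden` channels
        self.tcn = nn.Sequential(*[TemporalBlock(hidden, 3, d) for d in (1, 2, 4)])
        self.graph_mix = nn.Linear(hidden, hidden)        # node-wise transform for the graph step
        self.readout = nn.Linear(hidden, n_classes)

    def forward(self, bold):                              # bold: (n_rois, n_timepoints)
        adj = torch.softmax(correlation_adjacency(bold), dim=-1)  # row-normalised adjacency
        h = self.tcn(self.embed(bold.unsqueeze(1)))        # (n_rois, hidden, time)
        h = h.mean(dim=-1)                                 # pool over time -> (n_rois, hidden)
        h = torch.relu(adj @ self.graph_mix(h))            # one dense graph-convolution step
        return self.readout(h.mean(dim=0))                 # graph-level prediction


if __name__ == "__main__":
    model = SpatioTemporalSketch(n_rois=68)
    logits = model(torch.randn(68, 490))                   # e.g. 68 ROIs, 490 time points
    print(logits.shape)                                    # torch.Size([2])
```

In practice the dense graph step would be replaced by a proper GNN layer (e.g. from PyTorch Geometric) and trained with the data and objectives described in the paper; the sketch only shows how the temporal and spatial stages can compose end to end.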
2022
Published
International relevance
Article
Anonymous experts
Sector PHYS-06/A - Physics for life sciences, the environment and cultural heritage
English
Deep learning
Graph neural networks
UK Biobank
Time series
Temporal convolutional network
Rs-fMRI
Spatio-temporal dynamics
T.A. is funded by the W. D. Armstrong Trust Fund, University of Cambridge, UK. R.A.I.B. is funded by a British Academy Post-Doctoral fellowship and the Autism Research Trust. R.R.G. is funded by the Guarantors of Brain. L.P. is funded by the Medical Research Council (MRC) grant (MR/P01271X/1) at the University of Cambridge, UK. The UK Biobank data (application 20904) were curated and analysed using a computational facility funded by an MRC research infrastructure award (MR/M009041/1) and supported by the NIHR Cambridge Biomedical Research Centre and a Marmaduke Shield Award to Dr. Richard A. I. Bethlehem and Varun Warrier. The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health and Social Care. The models in this work were developed and evaluated on two different servers. Runs were performed using resources provided by the Cambridge Service for Data Driven Discovery (CSD3) operated by the University of Cambridge Research Computing Service (www.csd3.cam.ac.uk), provided by Dell EMC and Intel using Tier-2 funding from the Engineering and Physical Sciences Research Council (capital grant EP/P020259/1), and DiRAC funding from the Science and Technology Facilities Council (www.dirac.ac.uk). We also used Titan V GPUs generously donated to N.T. by NVIDIA.
Azevedo, T., Campbell, A., Romero-Garcia, R., Passamonti, L., Bethlehem, R., Lio', P., et al. (2022). A deep graph neural network architecture for modelling spatio-temporal dynamics in resting-state functional MRI data. MEDICAL IMAGE ANALYSIS, 79 [10.1016/j.media.2022.102471].
Azevedo, T; Campbell, A; Romero-Garcia, R; Passamonti, L; Bethlehem, R; Lio', P; Toschi, N
Journal article
Files in this record:

Azavedo-2020-A deep graph MRI.pdf
Type: Publisher's version (PDF)
License: Creative Commons
Size: 3.69 MB
Format: Adobe PDF
Access: open access

Medical image analysis.pdf
Type: Publisher's version (PDF)
License: Creative Commons
Size: 3.69 MB
Format: Adobe PDF
Access: open access

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2108/404183
Citations
  • PMC: not available
  • Scopus: 43
  • Web of Science (ISI): 36