Ferrante, M., Boccato, T., Spasov, S., Duggento, A., Toschi, N. (2023). VAESim: A probabilistic approach for self-supervised prototype discovery. IMAGE AND VISION COMPUTING, 137, 104746 [10.1016/j.imavis.2023.104746].
VAESim: A probabilistic approach for self-supervised prototype discovery
Ferrante, M.; Boccato, T.; Spasov, S.; Duggento, A.; Toschi, N.
2023-01-01
Abstract
In medical image datasets, discrete labels are often used to describe a continuous spectrum of conditions, making unsupervised image stratification a challenging task. In this work, we propose VAESim, an architecture for image stratification based on a conditional variational autoencoder. VAESim learns a set of prototypical vectors during training, each associated with a cluster in a continuous latent space. We perform a soft assignment of each data sample to the clusters and reconstruct the sample based on a similarity measure between the sample embedding and the prototypical vectors. To update the prototypical embeddings, we use an exponential moving average over the batch samples most similar to each prototype. We test our approach on the MNIST handwritten digit dataset and the PneumoniaMNIST medical benchmark dataset, where our method outperforms baselines in terms of kNN accuracy (up to +15% improvement) and performs on par with classification models trained in a fully supervised way. Our model also outperforms current end-to-end models for unsupervised stratification.
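The abstract describes two core operations: a soft assignment of sample embeddings to prototypes via a similarity measure, and an exponential-moving-average (EMA) update that pulls each prototype toward its most similar batch samples. The snippet below is a minimal NumPy sketch of this idea, not the authors' implementation; the function names, the choice of cosine similarity, and the decay value are assumptions for illustration.

```python
import numpy as np

def soft_assign(z, prototypes, temperature=1.0):
    """Soft-assign sample embeddings to prototypes via cosine similarity.

    z: (batch, dim) sample embeddings; prototypes: (K, dim).
    Returns (batch, K) assignment weights that sum to 1 per sample.
    """
    z_n = z / np.linalg.norm(z, axis=1, keepdims=True)
    p_n = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sim = (z_n @ p_n.T) / temperature
    sim -= sim.max(axis=1, keepdims=True)   # numerical stability
    w = np.exp(sim)
    return w / w.sum(axis=1, keepdims=True)

def ema_update(prototypes, z, weights, decay=0.99):
    """Move each prototype toward the mean of the batch samples
    assigned to it, using an exponential moving average."""
    nearest = weights.argmax(axis=1)        # most similar prototype per sample
    new_protos = prototypes.copy()
    for k in range(prototypes.shape[0]):
        members = z[nearest == k]
        if len(members) > 0:
            new_protos[k] = decay * prototypes[k] + (1 - decay) * members.mean(axis=0)
    return new_protos
```

In a full model, `soft_assign` would condition the decoder's reconstruction on the prototype weights, while `ema_update` would run once per training batch outside the gradient step.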
File | Type | License | Size | Format
---|---|---|---|---
VAESIM.pdf (open access) | Published version (PDF) | Creative Commons | 4.93 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.