Self-supervised learning (SSL) has become a crucial approach for pre-training deep learning models in natural and medical image analysis. However, applying transformations designed for natural images to three-dimensional (3D) medical data poses challenges. This study explores the efficacy of specific augmentations in the context of self-supervised pre-training for volumetric medical images. A 3D non-contrastive framework is proposed for in-domain self-supervised pre-training on 3D gray-scale thorax CT data, incorporating four spatial and two intensity augmentations commonly used in 3D medical image analysis. The pre-trained models, adapted versions of ResNet-50 and Vision Transformer (ViT)-S, are evaluated on lung nodule classification and lung tumor segmentation tasks. The results indicate a significant impact of SSL, with a remarkable increase in the area under the ROC curve (AUC) and Dice similarity coefficient (DSC) compared with training from scratch. For classification, random scalings and random rotations play a fundamental role in achieving higher downstream performance, while intensity augmentations show limited contribution and may even degrade performance. For segmentation, random intensity histogram shifting enhances robustness, while other augmentations have marginal or negative impacts. These findings underscore the necessity of tailored data augmentations within SSL for medical imaging, emphasizing the importance of task-specific transformations for optimal model performance in complex 3D medical datasets.
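The spatial and intensity augmentations named above could be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the actual parameter ranges, interpolation schemes, and framework are not specified in the abstract, and the function names and defaults here are illustrative assumptions (the sketch assumes cubic, intensity-normalized CT patches).

```python
import numpy as np

rng = np.random.default_rng(0)

def random_rotation(volume):
    """Rotate a cubic 3D patch by a random multiple of 90 degrees
    about a randomly chosen pair of axes (shape-preserving for cubes).
    The abstract does not state the rotation scheme; 90-degree rotations
    are used here only to keep the sketch interpolation-free."""
    k = int(rng.integers(1, 4))
    axes = tuple(int(a) for a in rng.choice(3, size=2, replace=False))
    return np.rot90(volume, k=k, axes=axes)

def random_scaling(volume, scale_range=(0.8, 1.2)):
    """Zoom the volume content by a random isotropic factor using
    nearest-neighbor index remapping, keeping the output shape fixed.
    scale_range is an assumed, not reported, parameter."""
    factor = rng.uniform(*scale_range)
    idx = [np.clip((np.arange(s) / factor).astype(int), 0, s - 1)
           for s in volume.shape]
    return volume[np.ix_(*idx)]

def random_intensity_shift(volume, max_shift=0.1):
    """Shift the intensity histogram by a random global offset
    (in normalized intensity units; max_shift is assumed)."""
    return volume + rng.uniform(-max_shift, max_shift)
```

In an SSL pipeline, two independently augmented views of the same patch would typically be produced by composing such transforms and fed to the two branches of the non-contrastive objective.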