Digital mammography (DM) and digital breast tomosynthesis, the gold standards for breast cancer screening, require correct breast positioning to ensure accuracy. Improper positioning can result in missed cancers or necessitate additional imaging. We propose an automated deep learning (DL) segmentation approach to perform multi-class identification of regions of interest (ROIs) commonly used to detect poor positioning in mediolateral oblique (MLO) breast views. We hypothesize that, by leveraging the well-established U-Net architecture, multi-class DL-based segmentation can accurately identify air, parenchyma, pectoralis, and nipple locations within MLO images. In this study, we employed hyperparameter searches to determine the optimal configuration of our proposed DL architecture, including the loss function; our best model achieved an average Sørensen-Dice coefficient of 0.919 ± 0.061 on the held-out test set. We observed particularly high localization performance for the nipple ROI. We believe our proposed segmentation model can serve as a foundational step for further mammogram analysis, such as breast positioning assessment and localized image processing tools.
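The Sørensen-Dice coefficient reported above is a standard overlap measure for segmentation masks. As a point of reference, it can be computed per class as shown in this minimal NumPy sketch (the function name and the per-class averaging are our illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Sørensen-Dice coefficient between two binary masks:
    2 * |A ∩ B| / (|A| + |B|). Ranges from 0 (disjoint) to 1 (identical)."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection) / (pred.sum() + target.sum() + eps)

def mean_dice(pred_labels, target_labels, classes):
    """Average Dice over a set of class labels (e.g. air, parenchyma,
    pectoralis, nipple) in integer-labeled segmentation maps."""
    scores = [dice_coefficient(pred_labels == c, target_labels == c)
              for c in classes]
    return float(np.mean(scores))
```

For a multi-class result such as the one described, the average is typically taken over the per-class (or per-image) Dice scores.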
Noiseless digital mammograms (DM) are unobtainable in clinical screening environments, limiting the development of deep learning-based (DL) denoising applications. Virtual clinical trials (VCTs) allow the precise simulation of noise levels in DM images for controlled training of DL models. We evaluated a set of DL denoising models, trained using VCT data, that showcase the trade-offs between denoising strength and fine-structure preservation. Our results show that metrics such as peak signal-to-noise ratio (PSNR) improve with the use of our trained residual convolutional neural network. This quantifiable improvement indicates that our proposed DL methodology can accurately denoise simulated mammograms.
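The PSNR metric used above compares a denoised image against its noiseless reference via the mean squared error. A minimal sketch of the standard definition follows (the function name and the normalized dynamic range are our assumptions; the study's exact evaluation pipeline is not specified here):

```python
import numpy as np

def psnr(reference, test, max_val=1.0):
    """Peak signal-to-noise ratio in dB between a noiseless reference
    image and a test (e.g. denoised) image with pixel values in
    [0, max_val]. Higher is better; identical images give infinity."""
    reference = np.asarray(reference, dtype=np.float64)
    test = np.asarray(test, dtype=np.float64)
    mse = np.mean((reference - test) ** 2)
    if mse == 0.0:
        return float("inf")
    return 10.0 * np.log10((max_val ** 2) / mse)
```

Because VCT data provide the noiseless ground truth that clinical images lack, this reference-based metric becomes directly computable, which is what makes the controlled evaluation described above possible.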