Breast cancer is the most common cancer among women. Although digital mammography plays a crucial role in the early detection of breast cancer, many tumors cannot be distinguished on mammography, especially in women with dense breast tissue. Contrast-enhanced magnetic resonance imaging (CE-MRI) of the breast is routinely used to find lesions that are invisible on mammography. These lesions must then be further assessed with MRI-guided biopsy. However, MRI-guided biopsy is expensive, time-consuming, and not widely available. In our earlier work, we introduced a novel method combining two registration steps, a biomechanical and an image-based registration, to transfer lesions from MRI to spot mammograms and thereby enable X-ray guided biopsy. In this paper, we focus on enhancing the image-based registration between full and spot mammograms and on analyzing correlations between the accuracy of our method and features such as view, lesion location, breast area, lesion size in each modality, and patient age. Results for 48 patients from the Medical University of Vienna are provided. The median target registration error is 20.9 mm and the standard deviation is 23.9 mm.
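The target registration error reported in studies like this is typically the Euclidean distance between a transferred lesion position and its ground-truth annotation. A minimal sketch of the summary statistics, using entirely hypothetical lesion coordinates:

```python
import math
import statistics

def tre(transferred, annotated):
    """Euclidean distance (mm) between a transferred lesion
    position and its ground-truth annotation."""
    return math.dist(transferred, annotated)

# Hypothetical per-patient lesion positions in mammogram coordinates (mm).
pairs = [((31.0, 54.0), (35.0, 57.0)),
         ((80.0, 22.0), (61.0, 30.0)),
         ((12.5, 40.0), (12.5, 65.0))]

errors = [tre(t, a) for t, a in pairs]
print(round(statistics.median(errors), 1))  # median TRE in mm
print(round(statistics.stdev(errors), 1))   # standard deviation in mm
```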
Compared to traditional magnetic resonance imaging, which provides anatomical information with high contrast, diffusion weighted imaging (DWI) can add functional information for a more precise detection and localization of breast cancer. However, DWI may suffer from artifacts due to off-resonance effects, including geometric distortions. This hinders a combined view, e.g. by image fusion. In this work, we investigate a distortion correction of DWI based on a nonlinear image registration with a T2 weighted image. Our method consists of three steps: a data cleaning step in which differences in image sections and resolution are compensated, an edge detection step which extracts the outline and inner structures of the breast in both the DWI and the T2 weighted image, and finally a non-rigid registration step using the demons algorithm. We use two clinical datasets with a total of seven patients for evaluation. Manual annotations of landmarks in 227 slices serve as the basis for calculating the registration error. Our method reduces the target registration error, based on the center of gravity of the annotations, from 5.5 mm to 3.1 mm on average and is most effective in cases with large initial deformation. Compared to the other methods tested in this study, the proposed method shows the lowest error. The method may contribute to a better combined diagnosis and may, for example, facilitate computer-aided detection and diagnosis by providing spatially well-aligned information.
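The demons algorithm used in the final registration step drives a deformation field with an intensity-difference force scaled by the gradient of the fixed image. A minimal 1-D sketch of the force computation (toy intensity profiles, not the authors' implementation; sign conventions vary with how the field is applied to the moving image):

```python
def gradient(img):
    """Central-difference gradient of a 1-D intensity profile."""
    g = [0.0] * len(img)
    for i in range(1, len(img) - 1):
        g[i] = (img[i + 1] - img[i - 1]) / 2.0
    return g

def demons_force(fixed, moving):
    """Thirion-style demons force per voxel:
    (m - f) * grad(f) / (|grad f|^2 + (m - f)^2),
    zero where the denominator vanishes (flat, matching regions)."""
    gf = gradient(fixed)
    force = []
    for f, m, g in zip(fixed, moving, gf):
        diff = m - f
        denom = g * g + diff * diff
        force.append(diff * g / denom if denom > 1e-9 else 0.0)
    return force

# Toy edge profiles: the moving edge sits to the right of the fixed edge,
# so non-zero forces appear only around the mismatched transition.
fixed  = [0, 0, 0, 1, 1, 1, 1, 1]
moving = [0, 0, 0, 0, 0, 1, 1, 1]
print(demons_force(fixed, moving))
```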
In multimodal diagnosis for early breast cancer detection, spatial alignment by means of image registration is an important task. We develop patient-specific biomechanical models of the breast, for which one of the challenges is automatic segmentation of magnetic resonance imaging (MRI) of the breast. In this paper, we propose a novel method using unsupervised neural networks with pre-processing and post-processing to enable automatic breast MRI segmentation for three tissue types simultaneously: fatty, glandular, and muscular tissue. Pre-processing aims at facilitating training of the network. The network architecture is a Kanezaki net extended to 3D and consists of two sub-networks. Post-processing enhances the obtained segmentations by removing common errors. 25 datasets of T2 weighted MRI from the Medical University of Vienna have been evaluated qualitatively by two observers, while eight datasets have been evaluated quantitatively based on a ground truth annotated by a medical practitioner. As a result of the qualitative evaluation, 22 out of 25 segmentations are usable for biomechanical models. Quantitatively, we achieved average Dice coefficients of 0.88 for fatty tissue, 0.50 for glandular tissue, and 0.86 for muscular tissue. The proposed method can serve as a robust basis for the automatic generation of biomechanical models.
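The reported Dice coefficients compare the predicted label map against the ground-truth annotation per tissue class. A minimal sketch, where the label values and flattened label maps are hypothetical:

```python
def dice(pred, truth, label):
    """Dice coefficient for one tissue label: 2|P ∩ T| / (|P| + |T|)."""
    p = [v == label for v in pred]
    t = [v == label for v in truth]
    inter = sum(a and b for a, b in zip(p, t))
    total = sum(p) + sum(t)
    return 2.0 * inter / total if total else 1.0

# Hypothetical flattened label maps: 0 = background, 1 = fatty,
# 2 = glandular, 3 = muscular tissue.
pred  = [0, 1, 1, 2, 2, 3, 3, 0]
truth = [0, 1, 1, 2, 3, 3, 3, 0]
for label, name in [(1, "fatty"), (2, "glandular"), (3, "muscular")]:
    print(name, round(dice(pred, truth, label), 2))
```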
Breast cancer is the most common cancer type among women. Although digital mammography plays an important role in early breast cancer detection, many cancers cannot be distinguished on mammography alone, particularly in individuals with dense breast tissue. Lesions not recognizable on mammography are frequently detected by contrast-enhanced magnetic resonance imaging (CE-MRI) of the breast. Owing to their suspicious characteristics, these lesions need to be further evaluated with MRI-guided biopsy. However, MRI-guided biopsy is costly, time-consuming, and not commonly available. In our earlier work, we proposed a novel method for a matching tool between MRI and spot mammograms, using a biomechanical model based registration to match MRI and full X-ray mammograms and an image based registration to align full X-ray mammograms and spot mammograms. In this paper, we focus on developing and evaluating methods for image based registration between full X-ray mammograms and spot mammograms. Results assessed for thirteen patients from the Medical University of Vienna are presented. The median target registration error (TRE) of the image based registration is 21.7 mm and the standard deviation is 9.3 mm.
Breast cancer is the most common cancer type among women. Approximately 40,000 women are expected to die from breast cancer every year. While digital mammography has a central role in the early diagnosis of breast cancer, many cancers are not visible in mammography, for example in women with dense breast tissue. Contrast-enhanced magnetic resonance imaging (CE-MRI) of the breast is often used to detect lesions not visible in mammography. Lesions with suspicious characteristics on CE-MRI need to be further assessed with MRI-guided biopsy. However, MRI-guided biopsy is expensive, time-consuming, and not widely available. In this paper, a novel method for a matching tool between MRI and spot mammograms is proposed. Our aim is to transfer information that is only visible in MRI onto mammographic spot projections, to enable X-ray guided biopsy even if the lesion is only visible in MRI. Two methods of registration are used in combination: a biomechanical model based registration between MRI and full view X-ray mammograms and a subsequent image based registration between full mammograms and spot mammograms. Preliminary results assessed for one patient from the Medical University of Vienna are presented. The target registration error (TRE) of the biomechanical model based registration is 2.4 mm and the TRE of the image based registration is 9.5 mm. The total TRE of the two steps is 7.3 mm.
In this study, we used a large previously built database of 2,892 mammograms and 31,650 single-mammogram radiologist assessments to simulate the impact of replacing one radiologist by an AI system in a double reading setting. The double human reading scenario and the hybrid double reading scenario (second reader replaced by an AI system) were simulated via bootstrapping using different combinations of mammograms and radiologists from the database. The main outcomes of each scenario were sensitivity, specificity, and workload (number of necessary readings). The results showed that when using AI as a second reader, workload can be reduced by 44%, sensitivity remains similar (difference −0.1%; 95% CI: −4.1% to 3.9%), and specificity increases by 5.3% (P < 0.001). Our results suggest that using AI as a second reader in a double reading setting, as in screening programs, could be a strategy to reduce workload and false positive recalls without affecting sensitivity.
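The bootstrapped comparison of reading scenarios can be sketched as follows. The per-case reader and AI decisions below are synthetic stand-ins for the study database, and the recall probabilities are illustrative assumptions, not the rates reported in the study:

```python
import random

random.seed(0)

# Synthetic screening cases: (is_cancer, reader1 recall, reader2 recall,
# AI recall). The recall probabilities are illustrative assumptions.
def synth_case():
    is_cancer = random.random() < 0.05
    p_reader = 0.85 if is_cancer else 0.05
    p_ai = 0.84 if is_cancer else 0.04
    return (is_cancer,
            random.random() < p_reader,
            random.random() < p_reader,
            random.random() < p_ai)

cases = [synth_case() for _ in range(2000)]

def sens_spec(recalls, labels):
    """Sensitivity and specificity of recall decisions vs. cancer labels."""
    tp = sum(1 for r, c in zip(recalls, labels) if r and c)
    tn = sum(1 for r, c in zip(recalls, labels) if not r and not c)
    pos = sum(labels)
    return tp / pos, tn / (len(labels) - pos)

def bootstrap(scenario, n_rep=100):
    """Resample cases with replacement; a case is recalled when either
    of the scenario's two readers flags it."""
    sens, spec = [], []
    for _ in range(n_rep):
        sample = random.choices(cases, k=len(cases))
        labels = [c[0] for c in sample]
        s, p = sens_spec([scenario(c) for c in sample], labels)
        sens.append(s)
        spec.append(p)
    return sum(sens) / n_rep, sum(spec) / n_rep

double_human = bootstrap(lambda c: c[1] or c[2])  # two radiologists
hybrid = bootstrap(lambda c: c[1] or c[3])        # radiologist + AI second reader
print("double human (sens, spec):", double_human)
print("hybrid reading (sens, spec):", hybrid)
```

With real per-reader assessments in place of `synth_case`, the same resampling loop yields confidence intervals for the sensitivity and specificity differences between scenarios.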