Recent foundation models have begun to yield remarkable successes across various downstream medical imaging applications. Yet, their potential within the context of multi-view medical image analysis remains largely unexplored. This study investigates the feasibility of leveraging foundation models to predict breast cancer from multi-view mammograms through parameter-efficient transfer learning (PETL). PETL was implemented by inserting lightweight adapter modules into an existing pre-trained transformer model; during training, only the adapter parameters were updated while the pre-trained weights of the foundation model remained frozen. To assess model performance, we retrospectively assembled a dataset of 949 patients, comprising 470 malignant cases and 479 normal or benign cases. Each patient has four mammograms acquired from two views (CC/MLO) of the right and left breasts. The large foundation model with 328 million (M) parameters, fine-tuned with adapters comprising only 3.2M tunable parameters (about 1% of the total model parameters), achieved a classification accuracy of 78.9% ± 1.7%. This performance was competitive with, but slightly inferior to, that of a smaller model with 36M parameters fine-tuned using traditional methods, which attained an accuracy of 80.4% ± 0.9%. These results suggest that while foundation models hold considerable potential, adapting them to medium-sized datasets and to the transition from single-view to multi-view image analysis, particularly where reasoning about feature relationships across different mammographic views is crucial, remains challenging. This underscores the need for innovative transfer learning approaches to better adapt and generalize foundation models to the complex requirements of multi-view medical image analysis.
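As an illustration of the adapter-based PETL scheme described above, the sketch below shows one common way to insert bottleneck adapters into a frozen vision transformer backbone. It is a minimal example under stated assumptions, not the authors' implementation: the adapter placement (after each encoder block), the bottleneck width, and the assumption that the backbone exposes its encoder blocks as a `blocks` ModuleList (as in timm-style ViTs) are all illustrative choices.

```python
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""

    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck, dim)
        # Zero-init the up-projection so the adapter starts as an identity mapping.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))


class AdaptedBlock(nn.Module):
    """Wrap a frozen transformer block with a trainable adapter on its output."""

    def __init__(self, block: nn.Module, dim: int, bottleneck: int = 64):
        super().__init__()
        self.block = block
        self.adapter = Adapter(dim, bottleneck)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(self.block(x))


def add_adapters(vit: nn.Module, dim: int, bottleneck: int = 64) -> nn.Module:
    """Freeze all pre-trained weights, then insert one adapter per encoder block.

    Assumes `vit.blocks` is a ModuleList of transformer encoder blocks
    (a timm-style ViT layout); this is an assumption for illustration,
    not a detail reported in the paper.
    """
    for p in vit.parameters():
        p.requires_grad = False  # keep the foundation model weights fixed
    vit.blocks = nn.ModuleList(
        AdaptedBlock(b, dim, bottleneck) for b in vit.blocks
    )
    return vit
```

With the backbone frozen, only the adapter projections (and typically a small task-specific classification head) receive gradients, which is how the tunable fraction can remain on the order of 1% of the total parameter count, as reported for the 328M-parameter model above.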