Paper
18 March 2022
A new attention mechanism based on inner product for medical image classification
Zimeng Chi, Guiqing Zhang, Xiaoyong Guo
Proceedings Volume 12168, International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2021); 121680O (2022) https://doi.org/10.1117/12.2631117
Event: International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2021), 2021, Harbin, China
Abstract
This paper introduces a novel method to refine the features in a deep learning (DL) model built for medical image classification. Framing the task as fine-grained visual categorization, we propose an attention mechanism based on the inner product between each feature column vector and a learnable attention vector. The proposed method is evaluated on two open-source benchmark datasets. The experimental results suggest that the model is stable and accurate in terms of accuracy, precision, recall, and F1-score. It is found that inserting the inner-product attention not only enhances overall classification accuracy and efficiency, but also improves the discriminative ability of the learned features. In short, employing the proposed attention mechanism increases the feature extraction capability of DL models.
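As a rough illustration of the mechanism described in the abstract, the following is a minimal PyTorch sketch, assuming a standard CNN feature map whose spatial columns are scored by their inner product with a learnable attention vector and then reweighted. The module name, softmax normalization, and tensor shapes are assumptions for illustration, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class InnerProductAttention(nn.Module):
    """Hypothetical reading of the abstract: score each spatial feature
    column by its inner product with a learnable attention vector, then
    reweight the feature map with the normalized scores."""

    def __init__(self, channels: int):
        super().__init__()
        # Learnable attention vector, one weight per channel.
        self.attention_vector = nn.Parameter(torch.randn(channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width) feature map from a CNN backbone.
        b, c, h, w = x.shape
        # Treat each spatial location as a C-dimensional column vector.
        cols = x.view(b, c, h * w)                      # (B, C, HW)
        # Inner product of every column with the attention vector.
        scores = torch.einsum("bcn,c->bn", cols, self.attention_vector)
        # Normalize scores over spatial positions (softmax is an assumption).
        weights = F.softmax(scores, dim=-1)             # (B, HW)
        # Reweight the columns and restore the spatial layout.
        refined = cols * weights.unsqueeze(1)           # (B, C, HW)
        return refined.view(b, c, h, w)


if __name__ == "__main__":
    attn = InnerProductAttention(channels=256)
    feats = torch.randn(2, 256, 14, 14)   # toy feature map
    out = attn(feats)
    print(out.shape)                       # torch.Size([2, 256, 14, 14])
```

In this reading, the attention vector acts as a learned template: spatial positions whose features align with it receive higher weights, which is one plausible way such a module could sharpen discriminative regions for fine-grained classification.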
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Zimeng Chi, Guiqing Zhang, and Xiaoyong Guo "A new attention mechanism based on inner product for medical image classification", Proc. SPIE 12168, International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2021), 121680O (18 March 2022); https://doi.org/10.1117/12.2631117
KEYWORDS
Medical imaging, Data modeling, Image classification, Performance modeling, X-ray imaging, X-rays, Visualization