31 January 2020 Evaluating CNN interpretability on sketch classification
Abraham Theodorus, Meike Nauta, Christin Seifert
Proceedings Volume 11433, Twelfth International Conference on Machine Vision (ICMV 2019); 114331Q (2020) https://doi.org/10.1117/12.2559536
Event: Twelfth International Conference on Machine Vision, 2019, Amsterdam, Netherlands
Abstract
While deep neural networks (DNNs) have been shown to outperform humans on many vision tasks, their opaque decision-making process inhibits widespread uptake, especially in high-risk scenarios. The BagNet architecture was designed to learn visual features that are easier to explain than the feature representations of other convolutional neural networks (CNNs). Previous experiments with BagNet focused on natural images, which provide rich texture and color information. In this paper, we investigate the performance and interpretability of BagNet on a data set of human sketches, i.e., a data set with limited color and no texture information. We also introduce a heatmap interpretability score (HI score) to quantify model interpretability, and present a user study examining BagNet interpretability from a user perspective. Our results show that, based on the HI score, BagNet is by far the most interpretable CNN architecture in our experimental setup.
© (2020) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Abraham Theodorus, Meike Nauta, and Christin Seifert "Evaluating CNN interpretability on sketch classification", Proc. SPIE 11433, Twelfth International Conference on Machine Vision (ICMV 2019), 114331Q (31 January 2020); https://doi.org/10.1117/12.2559536
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Visualization
Performance modeling
Neural networks
Artificial intelligence
Convolutional neural networks