Hematoxylin and eosin (H&E) is one of the main tissue stains used in histopathology to discriminate between nuclei and extracellular material during visual analysis of the tissue. However, histopathology slides are often characterized by stain color heterogeneity, caused by differences in tissue preparation settings across pathology institutes. Stain color heterogeneity poses challenges for machine learning-based computational analysis, increasing the difficulty of producing consistent diagnostic results and systems that generalize well. In other words, it is challenging for a deep learning architecture to generalize on stain color heterogeneous data acquired at several centers, particularly if the test data come from a center not present in the training data. In this paper, several methods that deal with stain color heterogeneity are compared regarding their capability to resolve center-dependent heterogeneity. Systematic and extensive experimentation is performed on a normal versus tumor tissue classification problem. Stain color normalization and augmentation procedures are used while training a convolutional neural network (CNN) to generalize on unseen data from several centers. The performance is compared on an internal test set (test data from the same pathology institutes as the training set) and an external test set (test data from institutes not included in the training set), which also allows measuring generalization performance. An improved performance is observed when the predictions of the two best-performing stain color normalization methods with augmentation are aggregated: an average AUC and F1-score on the external test set of 0.892 ± 0.021 and 0.817 ± 0.032, compared to the baseline of 0.860 ± 0.027 and 0.772 ± 0.024, respectively.
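Stain color augmentation of the kind evaluated above is often implemented as a random perturbation in the hematoxylin-eosin-DAB (HED) color space. The sketch below uses scikit-image's stain conversions; the function name and the perturbation ranges are illustrative assumptions, not the exact procedure of the paper.

```python
import numpy as np
from skimage.color import rgb2hed, hed2rgb

def hed_augment(rgb, sigma=0.03, bias=0.02, rng=None):
    """Randomly scale and shift each stain channel in HED space,
    then convert back to RGB. `sigma` and `bias` control how strong
    the simulated stain variation is (illustrative defaults)."""
    rng = np.random.default_rng() if rng is None else rng
    hed = rgb2hed(rgb)                                 # separate H, E, D stains
    alpha = rng.uniform(1 - sigma, 1 + sigma, size=3)  # per-channel scaling
    beta = rng.uniform(-bias, bias, size=3)            # per-channel shift
    hed = hed * alpha + beta
    return np.clip(hed2rgb(hed), 0.0, 1.0)
```

Applied on the fly during CNN training, each patch is seen with a slightly different virtual staining in every epoch, which encourages invariance to center-dependent stain variation.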
Prostate cancer (PCa) is one of the most frequent cancers in men. Its grading is required before initiating its treatment. The Gleason Score (GS) aims at describing and measuring the regularity in gland patterns observed by a pathologist on the microscopic or digital images of prostate biopsies and prostatectomies. Deep Learning (DL) based models are the state-of-the-art computer vision techniques for Gleason grading, learning high-level features with high classification power. However, for obtaining robust models with clinical-grade performance, a large number of local annotations are needed. Previous research showed that it is feasible to detect low- and high-grade PCa from digitized tissue slides relying only on the less expensive report-level (weakly supervised) labels, thus global rather than local labels. Despite this, few articles focus on classifying the finer-grained GS classes with weakly supervised models. The objective of this paper is to compare weakly supervised strategies for classification of the five classes of the GS from the whole slide image, using the global diagnostic label from the pathology reports as the only source of supervision. We compare different models trained on handcrafted features, shallow and deep learning representations. The training and evaluation are done on the publicly available TCGA-PRAD dataset, comprising 341 whole slide images of radical prostatectomies, where small patches are extracted within tissue areas and assigned the global report label as ground truth. Our results show that DL networks and class-wise data augmentation outperform other strategies and their combinations, reaching a kappa score of κ = 0.44, which could be further improved with a larger dataset or by combining both strongly and weakly supervised models.
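To illustrate how patch-level predictions under weak supervision can be turned into a slide-level class and scored with Cohen's kappa, here is a minimal sketch; the majority-vote aggregation and the function names are assumptions for illustration, not the paper's exact pipeline.

```python
import numpy as np

def slide_prediction(patch_preds):
    """Aggregate patch-level class predictions into one slide-level
    label by majority vote."""
    vals, counts = np.unique(patch_preds, return_counts=True)
    return int(vals[np.argmax(counts)])

def cohen_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa: chance-corrected agreement between two labelings."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    po = float(np.mean(y_true == y_pred))              # observed agreement
    pe = sum(float(np.mean(y_true == c)) * float(np.mean(y_pred == c))
             for c in range(n_classes))                # chance agreement
    return (po - pe) / (1 - pe)
```

With the five GS classes encoded as integers, the same aggregation and metric apply directly; kappa, unlike plain accuracy, discounts agreement expected by chance, which matters for the imbalanced class distribution of report-level labels.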
The Gleason grading system was developed for assessing prostate histopathology slides. It is correlated with the
outcome and incidence of relapse in prostate cancer. Although this grading is part of a standard protocol
performed by pathologists, visual inspection of whole slide images (WSIs) has an inherent subjectivity when
evaluated by different pathologists. Computer aided pathology has been proposed to generate an objective and
reproducible assessment that can help pathologists in their evaluation of new tissue samples. Deep convolutional
neural networks are a promising approach for the automatic classification of histopathology images and can
hierarchically learn subtle visual features from the data. However, a large number of manual annotations from
pathologists are commonly required to obtain sufficient statistical generalization when training new models that
can evaluate the large amounts of pathology data generated daily. A fully automatic approach that detects
prostatectomy WSIs with a high-grade Gleason score is proposed. We evaluate the performance of various deep
learning architectures, training them with patches extracted from automatically generated regions of interest
rather than from manually segmented ones. Relevant parameters for training the deep learning model such as
size and number of patches as well as the inclusion or not of data augmentation are compared between the tested
deep learning architectures. 235 prostate tissue WSIs with their pathology report from the publicly available
TCGA data set were used. An accuracy of 78% was obtained in a balanced set of 46 unseen test images with
different Gleason grades in a 2-class decision: high vs. low Gleason grade. Grades 7–8, which represent the
boundary decision of the proposed task, were particularly well classified. The method is scalable to larger data
sets with straightforward re-training of the model to include data from multiple sources, scanners, and acquisition
techniques. Automatically generated heatmaps for the WSIs could be useful for improving the selection of patches
when training networks on big data sets and for guiding the visual inspection of these images.
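Patch extraction from automatically generated regions of interest can be sketched as a sliding-window scan over a binary tissue mask, keeping only windows that contain enough tissue. The patch size, stride, and threshold below are illustrative defaults, not the settings used in the paper.

```python
import numpy as np

def patch_grid(tissue_mask, patch=224, stride=224, min_tissue=0.5):
    """Yield (row, col) top-left corners of patches whose tissue
    fraction in the binary mask meets the threshold."""
    h, w = tissue_mask.shape
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            if tissue_mask[y:y + patch, x:x + patch].mean() >= min_tissue:
                yield (y, x)
```

Running the same grid over the model's patch-level scores instead of the mask yields the kind of heatmap mentioned above, since each kept coordinate maps back to a fixed location in the WSI.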