Conventional pathology workflows rely on two-dimensional, slide-based analysis of thin tissue sections. This approach has several key limitations, including limited sampling, a lack of 3D structural information, and destruction of valuable clinical specimens. There is growing interest in nondestructive 3D pathology to address these shortcomings. Existing work has mainly focused on small-scale proof-of-concept studies, due in part to the difficulty of producing consistent, high-quality 3D pathology datasets across hundreds to thousands of specimens. To facilitate large-scale clinical studies, we present an end-to-end workflow for 3D pathology with an emphasis on data consistency and quality control.
KEYWORDS: Image segmentation, 3D modeling, Education and training, 3D image processing, Prostate, Data modeling, Biopsy, Pathology, Prostate cancer, Performance modeling
Significance: In recent years, we and others have developed non-destructive methods to obtain three-dimensional (3D) pathology datasets of clinical biopsies and surgical specimens. For prostate cancer risk stratification (prognostication), standard-of-care Gleason grading is based on examining the morphology of prostate glands in thin 2D sections. This motivates us to perform 3D segmentation of prostate glands in our 3D pathology datasets for computational analysis of 3D glandular features that could offer improved prognostic performance.

Aim: To facilitate prostate cancer risk assessment, we developed a computationally efficient and accurate deep learning model for 3D gland segmentation based on open-top light-sheet microscopy datasets of human prostate biopsies stained with a fluorescent analog of hematoxylin and eosin (H&E).

Approach: For 3D gland segmentation based on our H&E-analog 3D pathology datasets, we previously developed a hybrid deep learning and computer vision-based pipeline, called image translation-assisted segmentation in 3D (ITAS3D), which required a complex two-stage procedure and tedious manual optimization of parameters. To simplify this procedure, we use the 3D gland-segmentation masks previously generated by ITAS3D as training datasets for a direct end-to-end deep learning-based segmentation model, nnU-Net. The inputs to this model are 3D pathology datasets of prostate biopsies rapidly stained with an inexpensive fluorescent analog of H&E, and the outputs are 3D semantic segmentation masks of the gland epithelium, gland lumen, and surrounding stromal compartments within the tissue.

Results: nnU-Net achieves high accuracy for 3D gland segmentation even with limited training data. Moreover, compared with the previous ITAS3D pipeline, nnU-Net is simpler and faster to operate and maintains good accuracy even with lower-resolution inputs.

Conclusions: Our trained deep learning-based 3D segmentation model will facilitate future studies to demonstrate the value of computational 3D pathology for guiding critical treatment decisions for patients with prostate cancer.
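As a concrete illustration of the Approach above, the following is a minimal sketch of running inference with a trained nnU-Net model on 3D pathology volumes to produce semantic segmentation masks (gland epithelium, lumen, and stroma). The folder paths, dataset ID (501), configuration (3d_fullres), and fold are illustrative assumptions; the abstract does not specify which nnU-Net version or settings were used.

```python
# Minimal sketch: batch inference with a trained nnU-Net model on 3D prostate
# biopsy volumes. Assumes nnU-Net (v2) is installed and a trained model is
# available under the nnU-Net results folder (configured via its usual
# environment variables); all paths and IDs below are hypothetical.
import subprocess
from pathlib import Path

input_dir = Path("biopsies_nifti")   # hypothetical: H&E-analog volumes in nnU-Net input format
output_dir = Path("gland_masks")     # hypothetical: predicted 3D label maps are written here
output_dir.mkdir(exist_ok=True)

subprocess.run(
    [
        "nnUNetv2_predict",
        "-i", str(input_dir),
        "-o", str(output_dir),
        "-d", "501",          # placeholder dataset ID
        "-c", "3d_fullres",   # one standard configuration, not necessarily the authors'
        "-f", "0",            # single fold for illustration
    ],
    check=True,
)
```

The resulting label volumes can then be post-processed to extract 3D glandular features for downstream prognostic analysis.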
Esophageal adenocarcinoma (EAC), which can arise from Barrett’s esophagus (BE), has a 5-year survival rate of less than 20%. Unfortunately, the severe sampling limitations of conventional slide-based histology can reduce the sensitivity for detecting EAC and dysplasia (a precursor lesion to EAC) through regular endoscopic screening of BE patients. We have developed a non-destructive 3D pathology workflow that provides comprehensive evaluation of whole biopsies, along with a deep learning-based computational triage method that automatically segments potentially neoplastic regions (dysplasia or EAC) to guide pathologist review. A preliminary clinical validation study shows that our AI-assisted 3D workflow identifies neoplasia with higher per-biopsy sensitivity than conventional slide-based 2D histology.
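As a rough illustration of how such a triage output could guide review, below is a minimal sketch of a per-biopsy flagging rule applied to a predicted 3D neoplasia mask. The file format, label convention, and volume-fraction threshold are hypothetical assumptions, not the criteria validated in the study.

```python
# Minimal sketch of a per-biopsy triage rule, assuming the deep learning model
# has already produced a 3D label volume marking potentially neoplastic voxels
# (dysplasia or EAC) for each biopsy. Label value and threshold are illustrative.
import numpy as np

NEOPLASIA_LABEL = 1        # assumed label value for neoplastic voxels
REVIEW_THRESHOLD = 0.001   # assumed minimum neoplastic volume fraction to flag a biopsy

def flag_for_review(mask_path: str) -> bool:
    """Return True if the biopsy should be prioritized for pathologist review."""
    mask = np.load(mask_path)  # 3D integer label volume (hypothetical .npy format)
    neoplastic_fraction = float(np.mean(mask == NEOPLASIA_LABEL))
    return neoplastic_fraction > REVIEW_THRESHOLD
```

In practice, the flagged regions themselves (not just a per-biopsy decision) would be presented to the pathologist to focus review on the most suspicious tissue.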