Presentation
Head and neck cancer cisualization using deep learning combined with fluorescence lifetime imaging and white light imaging
5 March 2021
Takanori Fukazawa, Mark Marsden, Brent W. Weyers, Yu-Cheng Deng, Julien Bec, D. Gregory Farwell, Laura Marcu
Abstract
A key step in mitigating tumor recurrence for patients with head and neck cancer is adequate surgical margin delineation. However, presently available techniques limit accurate tumor margin detection during surgery. Herein, we report on tumor visualization using deep learning by combining autofluorescence images acquired by a fiber-based fluorescence lifetime imaging (FLIm) system with white light images (WLI) obtained by surgical cameras. To achieve accurate registration between FLIm and WLI, a tissue motion correction algorithm was employed as a pre-processing step. The trained model was applied to differentiating healthy from cancerous tissue in a dataset of 50 head and neck cancer patients (ROC-AUC: 0.87).
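To illustrate the kind of multimodal approach the abstract describes, the following is a minimal sketch (not the authors' published model) of a two-branch convolutional network that fuses co-registered WLI patches with FLIm-derived lifetime maps for binary healthy-vs-cancer classification. Channel counts, patch size, and layer widths are illustrative assumptions.

import torch
import torch.nn as nn

class FLImWLIFusionNet(nn.Module):
    """Late-fusion CNN over registered WLI and FLIm patches (illustrative sketch)."""
    def __init__(self, wli_channels=3, flim_channels=4, num_classes=2):
        super().__init__()
        # Separate convolutional encoder for each modality.
        self.wli_branch = nn.Sequential(
            nn.Conv2d(wli_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.flim_branch = nn.Sequential(
            nn.Conv2d(flim_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Concatenate the two feature vectors and classify.
        self.classifier = nn.Linear(32 + 32, num_classes)

    def forward(self, wli, flim):
        w = self.wli_branch(wli).flatten(1)
        f = self.flim_branch(flim).flatten(1)
        return self.classifier(torch.cat([w, f], dim=1))

# Example forward pass on random tensors standing in for registered patches.
model = FLImWLIFusionNet()
wli_patch = torch.rand(8, 3, 64, 64)    # batch of WLI patches (RGB)
flim_patch = torch.rand(8, 4, 64, 64)   # batch of FLIm lifetime-map patches (4 spectral channels assumed)
logits = model(wli_patch, flim_patch)   # shape: (8, 2)

Per-pixel predictions from such a model could then be scored against histopathology labels to produce the reported ROC-AUC.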
Conference Presentation
© (2021) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Takanori Fukazawa, Mark Marsden, Brent W. Weyers, Yu-Cheng Deng, Julien Bec, D. Gregory Farwell, and Laura Marcu "Head and neck cancer visualization using deep learning combined with fluorescence lifetime imaging and white light imaging", Proc. SPIE 11631, Advanced Biomedical and Clinical Diagnostic and Surgical Guidance Systems XIX, 116310O (5 March 2021); https://doi.org/10.1117/12.2581449
KEYWORDS: Tumors, Cancer, Fluorescence lifetime imaging, Visualization, Algorithm development, Optical sensors, Pathology