Significance: Morphological changes in the epidermis are critical for the diagnosis and assessment of various skin diseases. Owing to its noninvasiveness, optical coherence tomography (OCT) is a good candidate for observing microstructural changes in skin. Convolutional neural networks (CNNs) have been successfully used for automated segmentation of skin layers in OCT images, providing an objective evaluation of skin disorders. Such methods are reliable only when a large amount of labeled data is available, and producing those labels is time-consuming and tedious. The scarcity of patient data adds a further obstacle to making the model generalizable.
Aim: We developed a semisupervised representation learning method to provide data augmentation.
Approach: We used rodent models to train neural networks for accurate segmentation of clinical data.
Results: Learning quality is maintained with only one labeled OCT image per volume acquired from patients. Data augmentation introduces semantically meaningful variance, allowing better generalization. Our experiments demonstrate that the proposed method achieves accurate segmentation and thickness measurement of the epidermis.
Conclusion: This is the first report of semisupervised representation learning applied to OCT images of clinical data, making full use of the data acquired from rodent models. The proposed method promises to aid in the clinical assessment and treatment planning of skin diseases.
Significance: To develop therapeutic treatments that accelerate wound healing, it is crucial to understand the processes underlying skin wound healing, especially re-epithelialization. Detecting the epidermis and scab is important because their thickness is a vital indicator of whether the re-epithelialization process is proceeding normally. Since optical coherence tomography (OCT) is a real-time, noninvasive imaging technique that can perform cross-sectional evaluation of tissue microstructure, it is an ideal modality for monitoring the thickness changes of epidermal and scab tissues during wound healing at micron-level resolution. Traditionally, epidermal and scab regions were segmented manually, which is time-consuming and impractical for real-time use.
Aim: We aim to develop a deep-learning-based skin layer segmentation method for automated quantitative assessment of the thickness of in vivo epidermis and scab tissues over a time course of healing in a rodent model.
Approach: Five convolutional neural networks were trained on manually labeled epidermis and scab segmentations from 1000 OCT B-scan images (with labeling assisted by the corresponding angiographic information). The segmentation performance of the five architectures was compared qualitatively and quantitatively on a validation set.
Results: Our results show that the calculated thickness is obtained faster and more accurately than by human experts. The U-Net architecture outperformed the other deep neural network architectures, with an F1-score of 0.894, a mean intersection over union of 0.875, a Dice similarity coefficient of 0.933, and an average symmetric surface distance of 18.28 μm. Furthermore, our algorithm provides abundant quantitative parameters of the wound, derived from the corresponding thickness maps in different healing phases. Among them, normalized epidermal thickness is recommended as an essential hallmark for describing the re-epithelialization process of the rodent model.
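The overlap metrics reported above (F1-score, intersection over union, and Dice similarity coefficient) have standard definitions for binary segmentation masks. The sketch below is an illustrative, minimal implementation of these definitions and is not taken from the paper's code; the mask shapes and names are assumptions.

```python
# Minimal sketch of the reported overlap metrics for binary segmentation
# masks (hypothetical; not the authors' implementation).
import numpy as np

def dice_coefficient(pred, target):
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|).
    For a binary foreground mask, this equals the foreground F1-score."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    denom = pred.sum() + target.sum()
    return 2.0 * intersection / denom if denom else 1.0

def iou(pred, target):
    """Intersection over union (Jaccard index): |A ∩ B| / |A ∪ B|."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    union = np.logical_or(pred, target).sum()
    return np.logical_and(pred, target).sum() / union if union else 1.0
```

Because Dice and F1 coincide for binary masks, papers often report both when foreground/background segmentation is evaluated per class, as done here.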
Conclusions: The automatic segmentation and thickness measurements across different phases of wound healing demonstrate that our pipeline is a robust, quantitative, and accurate method that can serve as a standard model for further research into the effects of external pharmacological and physical factors.
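A thickness map like those described above can be derived from a binary segmentation mask by counting epidermis pixels along each A-line (image column) and scaling by the axial pixel spacing; a normalized thickness then compares the wound site against unwounded skin. The following is a minimal sketch under assumed names and a placeholder pixel size, not the paper's implementation.

```python
# Illustrative epidermal thickness computation from a binary B-scan mask
# (rows = depth, columns = A-lines). PIXEL_SIZE_UM is a placeholder value,
# not a figure from the paper.
import numpy as np

PIXEL_SIZE_UM = 3.5  # assumed axial pixel spacing in microns

def thickness_profile_um(mask):
    """Thickness per A-line: epidermis pixel count in each column,
    converted to microns."""
    return mask.astype(bool).sum(axis=0) * PIXEL_SIZE_UM

def normalized_thickness(wound_mask, baseline_mask):
    """Mean wound-site epidermal thickness divided by the mean thickness
    of unwounded (baseline) skin; ~1.0 indicates recovery to baseline."""
    return (thickness_profile_um(wound_mask).mean()
            / thickness_profile_um(baseline_mask).mean())
```

Tracking this normalized value over the healing time course is one way a scalar hallmark of re-epithelialization can be extracted from per-B-scan thickness maps.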