Paper
1 June 2020
Texture reconstruction based on underlying pattern modification that reflects user's intention
Mami Nagoya, Tomoaki Kimura, Hiroyuki Tsuji
Proceedings Volume 11515, International Workshop on Advanced Imaging Technology (IWAIT) 2020; 115150D (2020) https://doi.org/10.1117/12.2566900
Event: International Workshop on Advanced Imaging Technologies 2020 (IWAIT 2020), 2020, Yogyakarta, Indonesia
Abstract
This paper presents a method for reconstructing texture patterns using deep learning. The proposed method is based on a deep neural network, the pix2pix generative adversarial network (GAN), which learns the conversion between input and output images. It extends pix2pix by adding constraints to the network so that the underlying image pattern can be changed while the fine texture of the input is retained. Using texture images with underlying patterns and fine textures as test data, we verified the effectiveness of our modification through several computational experiments. Although the generated images preserve the color and edge information of the input, they are blurred, and in some cases the input texture information cannot be reproduced. These problems remain to be addressed in future research.
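The abstract does not give the exact form of the added constraints. As a rough illustration only, the sketch below (PyTorch) shows a pix2pix-style generator loss with an extra, hypothetical texture-retention penalty standing in for such a constraint. The adversarial and L1 terms follow the standard pix2pix objective; the high-frequency penalty, the weights, and all names (`generator_loss`, `texture_weight`, etc.) are assumptions, not the authors' formulation.

```python
# Minimal sketch of a pix2pix-style generator loss with an added constraint.
# The texture-retention term is purely illustrative and is NOT the paper's
# actual constraint; it only approximates "keep the input's fine texture".
import torch
import torch.nn.functional as F

def generator_loss(disc_fake_logits, fake_img, target_img, input_img,
                   l1_weight=100.0, texture_weight=10.0):
    # Standard pix2pix terms: adversarial loss + L1 reconstruction loss.
    adv = F.binary_cross_entropy_with_logits(
        disc_fake_logits, torch.ones_like(disc_fake_logits))
    l1 = F.l1_loss(fake_img, target_img)

    # Hypothetical constraint: keep the output's high-frequency content
    # (a stand-in for fine texture) close to that of the input image.
    def high_freq(x):
        blur = F.avg_pool2d(x, kernel_size=5, stride=1, padding=2)
        return x - blur

    texture_penalty = F.l1_loss(high_freq(fake_img), high_freq(input_img))

    return adv + l1_weight * l1 + texture_weight * texture_penalty
```

In this sketch the generator is still trained adversarially against a conditional discriminator, as in pix2pix; only the extra penalty biases it toward retaining the input's fine texture while the target image supplies the modified underlying pattern.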
© (2020) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Mami Nagoya, Tomoaki Kimura, and Hiroyuki Tsuji "Texture reconstruction based on underlying pattern modification that reflects user's intention", Proc. SPIE 11515, International Workshop on Advanced Imaging Technology (IWAIT) 2020, 115150D (1 June 2020); https://doi.org/10.1117/12.2566900
KEYWORDS
Neural networks
RGB color model
Computer graphics
Image processing