Image Shadow Removal Based on Improved Generative Adversarial Network
Keywords:
Deep learning, Generative Adversarial Network, Image processing, Shadow removal, Shadow detection

Abstract
Traditional deep learning-based shadow removal methods often alter pixels in non-shadow areas and fail to produce shadow removal results with natural boundary transitions. To address this problem, we propose a novel multi-stage shadow removal framework based on Generative Adversarial Networks (GANs). First, a multi-task-driven generator produces a full-shadow mask and a partial-shadow mask for the input image through its shadow detection subnet and mask generation subnet, respectively. Second, guided by these two masks, a full shadow module and a partial shadow module are designed to remove the different types of shadow in the image in stages. Finally, a new combined loss function is constructed on the basis of the least squares loss to achieve better results. Compared with recent deep learning shadow removal methods on the selected dataset, the proposed method reduces the balanced error rate by approximately 4.39%, increases the structural similarity by approximately 0.44%, and reduces the pixel root mean square error by approximately 13.32%. The experimental results show that the shadow removal results obtained by this method have smoother boundary transitions.
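To make the staged, mask-guided design concrete, the following is a minimal PyTorch sketch of a multi-task generator of this kind. The class and attribute names (MultiTaskGenerator, detect_head, full_module, and so on), the layer widths, and the specific choice of concatenating each mask with the image as guidance are illustrative assumptions, not the paper's actual architecture:

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Simple conv + normalization + activation unit used throughout the sketch.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.InstanceNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class MultiTaskGenerator(nn.Module):
    """Illustrative multi-task generator: a shared encoder feeds a shadow
    detection subnet and a mask generation subnet; the resulting masks then
    guide a full-shadow module and a partial-shadow module in sequence."""

    def __init__(self, feat=32):
        super().__init__()
        self.encoder = nn.Sequential(conv_block(3, feat), conv_block(feat, feat))
        # Detection subnet -> full-shadow mask; mask subnet -> partial-shadow mask.
        self.detect_head = nn.Sequential(nn.Conv2d(feat, 1, 1), nn.Sigmoid())
        self.mask_head = nn.Sequential(nn.Conv2d(feat, 1, 1), nn.Sigmoid())
        # Stage 1 removes full (umbra) shadows; stage 2 refines penumbra borders.
        self.full_module = nn.Sequential(conv_block(4, feat), nn.Conv2d(feat, 3, 1))
        self.partial_module = nn.Sequential(conv_block(4, feat), nn.Conv2d(feat, 3, 1))

    def forward(self, x):
        feats = self.encoder(x)
        full_mask = self.detect_head(feats)
        partial_mask = self.mask_head(feats)
        # Stage 1: remove umbra regions, guided by the full-shadow mask.
        stage1 = self.full_module(torch.cat([x, full_mask], dim=1))
        # Stage 2: smooth penumbra transitions, guided by the partial-shadow mask.
        stage2 = self.partial_module(torch.cat([stage1, partial_mask], dim=1))
        return stage2, full_mask, partial_mask
```

Processing the umbra and penumbra separately is what allows the second stage to concentrate on boundary transitions without disturbing already-clean non-shadow pixels.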
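The abstract specifies only that the combined loss is built on the least squares (LSGAN) objective; the reconstruction and mask-supervision terms below, and their weights lambda_rec and lambda_mask, are assumptions added for illustration:

```python
import torch
import torch.nn.functional as F

def lsgan_d_loss(d_real, d_fake):
    # Least-squares discriminator loss: push real scores to 1, fake to 0.
    return 0.5 * ((d_real - 1.0) ** 2).mean() + 0.5 * (d_fake ** 2).mean()

def lsgan_g_loss(d_fake):
    # Least-squares generator loss: push fake scores toward 1.
    return 0.5 * ((d_fake - 1.0) ** 2).mean()

def combined_g_loss(d_fake, pred, target, pred_mask, gt_mask,
                    lambda_rec=100.0, lambda_mask=10.0):
    # Hypothetical combination: LSGAN adversarial term plus an L1
    # reconstruction term and a binary cross-entropy mask term.
    adv = lsgan_g_loss(d_fake)
    rec = F.l1_loss(pred, target)
    mask = F.binary_cross_entropy(pred_mask, gt_mask)
    return adv + lambda_rec * rec + lambda_mask * mask
```

Relative to the standard cross-entropy GAN loss, the least squares formulation penalizes samples by their distance from the decision boundary, which tends to stabilize training and reduce vanishing gradients.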
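For reference, here is a sketch of the three reported metrics, assuming uint8 RGB images and binary shadow masks as NumPy arrays. Shadow removal papers sometimes compute RMSE in the LAB color space; plain pixel RMSE is shown here:

```python
import numpy as np
from skimage.metrics import structural_similarity

def balanced_error_rate(pred_mask, gt_mask):
    # BER over binary masks: mean of the shadow and non-shadow error rates.
    pred, gt = pred_mask.astype(bool), gt_mask.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    tn = np.logical_and(~pred, ~gt).sum()
    pos, neg = gt.sum(), (~gt).sum()
    return 1.0 - 0.5 * (tp / max(pos, 1) + tn / max(neg, 1))

def pixel_rmse(pred_img, gt_img):
    # Root mean square error over all pixels and channels.
    diff = pred_img.astype(np.float64) - gt_img.astype(np.float64)
    return np.sqrt((diff ** 2).mean())

def ssim_score(pred_img, gt_img):
    # Structural similarity on multi-channel (RGB) images.
    return structural_similarity(pred_img, gt_img, channel_axis=-1,
                                 data_range=255)
```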