A robust and efficient image de-fencing approach using conditional generative adversarial networks.

Citation metadata

From: Signal, Image and Video Processing (Vol. 15, Issue 2)
Publisher: Springer
Document Type: Report; Brief article
Length: 262 words

Abstract:

Keywords: Automated de-fencing; Two-stage cGAN network; Fence mask detection; Image inpainting

Image de-fencing is one of the most important aspects of recreational photography, in which the objective is to remove the fence texture present in an image and generate an aesthetically pleasing version of the same image without the fence texture. In this paper, we present an automated and effective technique for fence removal and image reconstruction using conditional generative adversarial networks (cGANs). These networks have been successfully applied in several other domains of computer vision, focusing on image generation and rendering. Our approach is based on a two-stage architecture involving two cGANs in succession: the first cGAN generates the fence mask from an input fenced image, and the second generates the final de-fenced image from the given input and the corresponding fence mask obtained from the first cGAN. The two networks are trained independently using suitable loss functions, and during the deployment phase they are stacked together in an end-to-end manner to generate the de-fenced image from an unknown test image. Extensive qualitative and quantitative evaluations on challenging data sets demonstrate the effectiveness of our approach over state-of-the-art de-fencing techniques. The data sets used in the experiments have also been made available for further comparison.

Author Affiliation:
(1) Department of Computer Science and Engineering, Indian Institute of Technology (BHU), 221005, Varanasi, India
(2) School of Electrical and Electronic Engineering, Nanyang Technological University, 639798, Singapore, Singapore
(d) pratik.cse@iitbhu.ac.in

Article History:
Registration Date: 07/17/2020
Received Date: 02/18/2020
Accepted Date: 07/17/2020
Online Date: 07/30/2020
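
Illustrative sketch: the two-stage stacking described in the abstract can be summarized in code. The following is a minimal, hypothetical PyTorch illustration of the deployment-phase pipeline; the class names, layer choices, and the defence() helper are assumptions for illustration only and do not reproduce the paper's actual generator architectures or loss functions.

import torch
import torch.nn as nn

class MaskGenerator(nn.Module):
    # Stage 1 (hypothetical): predicts a fence mask from a fenced RGB image.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

class InpaintGenerator(nn.Module):
    # Stage 2 (hypothetical): reconstructs the de-fenced image from the
    # fenced image concatenated with the predicted fence mask.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x, mask):
        return self.net(torch.cat([x, mask], dim=1))

def defence(image, g_mask, g_inpaint):
    # Deployment phase: stack the two independently trained generators
    # end-to-end, with no gradient computation at test time.
    with torch.no_grad():
        mask = g_mask(image)            # stage 1: fence mask detection
        clean = g_inpaint(image, mask)  # stage 2: fence removal / inpainting
    return clean

# Usage with a dummy batch (N, 3, H, W) standing in for a real fenced photo.
fenced = torch.rand(1, 3, 256, 256)
output = defence(fenced, MaskGenerator().eval(), InpaintGenerator().eval())
print(output.shape)  # torch.Size([1, 3, 256, 256])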

Source Citation

Gale Document Number: GALE|A651735668