Image rain removal and illumination enhancement done in one go

Yecong Wan, Yuanshuo Cheng, Mingwen Shao*, Jordi Gonzalez Sabate

*Corresponding author for this work

Research output: Contribution to journal › Article › Research › peer-review

6 Citations (Scopus)

Abstract

Rain removal plays an important role in the restoration of degraded images. Recently, CNN-based methods have achieved remarkable success. However, these approaches neglect that real-world rain often appears under low-light conditions, which further degrade image quality and hinder restoration. It is therefore indispensable to jointly remove rain and enhance illumination when restoring real-world rainy images. To this end, we propose a novel spatially-adaptive network, dubbed SANet, which removes rain and enhances illumination in one go under the guidance of a degradation mask. Meanwhile, to fully utilize negative samples, a contrastive loss is proposed to preserve more natural textures and consistent illumination. In addition, we present a new synthetic dataset, named DarkRain, to boost the development of rain image restoration algorithms in practical scenarios. DarkRain not only contains different degrees of rain but also considers different lighting conditions, simulating real-world rainfall scenarios more realistically. SANet is extensively evaluated on the proposed dataset and attains new state-of-the-art performance against combinations of existing methods. Moreover, after a simple transformation, our SANet surpasses existing state-of-the-art algorithms in both rain removal and low-light image enhancement.
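The abstract does not specify the exact form of the contrastive loss used in SANet. The snippet below is a minimal sketch, assuming the common restoration-oriented formulation in which the restored output (anchor) is pulled toward the clean target (positive) and pushed away from the rainy, low-light input (negative) in a fixed feature space; the class name `ContrastiveRestorationLoss` and the lightweight frozen encoder are illustrative stand-ins, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ContrastiveRestorationLoss(nn.Module):
    """Generic contrastive regularizer for image restoration (sketch only)."""

    def __init__(self):
        super().__init__()
        # A small frozen conv encoder stands in for the pretrained feature
        # extractor (e.g. VGG features) typically used in such losses.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        for p in self.encoder.parameters():
            p.requires_grad_(False)
        self.l1 = nn.L1Loss()

    def forward(self, restored, clean, degraded):
        anchor = self.encoder(restored)
        positive = self.encoder(clean).detach()
        negative = self.encoder(degraded).detach()
        # Ratio form: minimize distance to the clean positive while
        # keeping distance from the degraded negative large.
        return self.l1(anchor, positive) / (self.l1(anchor, negative) + 1e-7)

# Usage sketch: loss = ContrastiveRestorationLoss()(pred, gt, rainy_dark_input)
```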

Original language: English
Article number: 109244
Journal: Knowledge-Based Systems
Volume: 252
DOIs
Publication status: Published - 27 Sept 2022

Keywords

  • Contrastive learning
  • Low-light image enhancement
  • Rain removal
  • Spatially-adaptive network
