Abstract
Recent years have witnessed significant advances in image deraining, driven by both effective image priors and deep learning models. As each deraining approach is developed under its own settings (e.g., training and test datasets, evaluation criteria), fairly and comprehensively evaluating existing approaches is not a trivial task. Although existing surveys aim to review image deraining approaches comprehensively, few of them provide unified evaluation settings for examining deraining capability and practicality. In this paper, we provide a comprehensive review of existing image deraining methods and a unified evaluation setting for assessing their performance. To support extensive evaluation, we construct a new high-quality benchmark named HQ-RAIN, consisting of 5,000 paired high-resolution synthetic images with higher harmony and realism. We also discuss the remaining challenges and highlight several future research opportunities worth exploring. To help general users reproduce and track the latest deraining technologies, we build an online platform that provides an off-the-shelf toolkit, including large-scale performance evaluation. This online platform and the proposed new benchmark are publicly available and will be regularly updated at http://www.deraining.tech/.
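As a simple illustration of what a paired evaluation on HQ-RAIN looks like (this is a minimal sketch, not the official toolkit; the metric choice, file names, and paths below are assumptions), a derained result can be scored against its ground truth with full-reference metrics such as PSNR and SSIM using scikit-image:

# Illustrative sketch only: paths and file names are hypothetical;
# refer to the online platform for the official evaluation toolkit.
import numpy as np
from skimage import io
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate_pair(derained_path: str, gt_path: str):
    """Compute PSNR and SSIM between a derained image and its ground truth."""
    derained = io.imread(derained_path).astype(np.float64) / 255.0
    gt = io.imread(gt_path).astype(np.float64) / 255.0
    psnr = peak_signal_noise_ratio(gt, derained, data_range=1.0)
    ssim = structural_similarity(gt, derained, data_range=1.0, channel_axis=-1)
    return psnr, ssim

# Example usage (hypothetical file names):
# psnr, ssim = evaluate_pair("derained/0001.png", "HQ-RAIN/gt/0001.png")
# print(f"PSNR: {psnr:.2f} dB, SSIM: {ssim:.4f}")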
Paper
Xiang Chen, Jinshan Pan, Jiangxin Dong, Jinhui Tang. "Towards Unified Deep Image Deraining: A Survey and A New Benchmark", arXiv preprint arXiv:2310.03535 (arXiv), 2023.
[PDF]
[Bibtex]
@article{chen2023survey,
  title={Towards Unified Deep Image Deraining: A Survey and A New Benchmark},
  author={Chen, Xiang and Pan, Jinshan and Dong, Jiangxin and Tang, Jinhui},
  journal={arXiv preprint arXiv:2310.03535},
  year={2023}
}
News
- 2023.12.04: The proposed HQ-RAIN is available.
- 2023.10.06: The paper is available [here].
- 2023.10.01: The online platform is available.
Acknowledgment
Part of the code for this platform is borrowed from SIDD.
License
All online resources are released under the MIT License.
Contact
For any questions, remarks, comments, or collaborations, please contact: chenxiang@njust.edu.cn.