EvDehaze: Dynamic Scene Dehazing via Event Camera

Abstract

Recovering clear images under hazy conditions is a critical task for outdoor vision systems. To address the limitations of traditional dehazing models that rely solely on RGB sensors, we propose EvDehaze, the first event-guided diffusion framework for image dehazing. Leveraging the high dynamic range and fine temporal resolution of event cameras, EvDehaze conditions a latent diffusion model on event-derived edge and illumination features. We also introduce a newly collected real-world dataset with synchronized RGB-event pairs captured under heavy haze. Our method achieves state-of-the-art performance on both synthetic and real-world benchmarks.
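The abstract describes conditioning a latent diffusion model on event-derived features but gives no implementation details. As a rough illustration only, the sketch below shows one common way such conditioning can be wired up: events are accumulated into a temporal voxel grid, collapsed into an edge-strength map, and concatenated channel-wise with the image latent. All function names, shapes, and the voxelization scheme are our own assumptions, not the authors' code, and we assume the event map has already been resized to the latent's spatial resolution.

```python
import numpy as np

def events_to_voxel(events, height, width, num_bins=5):
    """Accumulate (t, x, y, p) events into a temporal voxel grid.

    events: (N, 4) array with timestamps t normalized to [0, 1),
    integer pixel coordinates (x, y), and polarity p in {-1, +1}.
    (Hypothetical representation; the paper does not specify one.)
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    t = events[:, 0]
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3]
    b = np.clip((t * num_bins).astype(int), 0, num_bins - 1)
    np.add.at(voxel, (b, y, x), p)  # scatter-add polarities into bins
    return voxel

def edge_condition(voxel):
    """Collapse the voxel grid into a single edge-strength map by summing
    absolute event counts over time, then normalizing to [0, 1]."""
    edge = np.abs(voxel).sum(axis=0)
    peak = edge.max()
    return edge / peak if peak > 0 else edge

def build_condition(latent, voxel):
    """Concatenate the hazy-image latent with the event edge map along the
    channel axis, forming one possible conditioning input for a denoiser."""
    edge = edge_condition(voxel)[None]  # (1, H, W)
    return np.concatenate([latent, edge], axis=0)

# Toy usage: two events, an 8x8 latent with 4 channels.
events = np.array([[0.1, 2, 3, 1.0],
                   [0.9, 2, 3, -1.0]])
voxel = events_to_voxel(events, 8, 8, num_bins=5)
latent = np.zeros((4, 8, 8), dtype=np.float32)
cond = build_condition(latent, voxel)  # shape (5, 8, 8)
```

In practice the edge map would be fed through a learned encoder (e.g. cross-attention or a control branch) rather than raw concatenation; this sketch only illustrates the data flow from events to conditioning signal.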

Motivation and Overview

[Figure: overview]

Method: EvDehaze Framework

[Figure: EvDehaze architecture]

Event-Based Real-World Dataset

[Figure: real-world dataset]

Qualitative Comparison

[Figure: qualitative comparison on SOTS (two panels)]

Real Event Evaluation

[Figure: evaluation on real event data]

Simulated Events from RESIDE

[Figure: results with events simulated from RESIDE]