arxiv:2504.09621

Tokenize Image Patches: Global Context Fusion for Effective Haze Removal in Large Images

Published on Apr 13
· Submitted by fengyanzi on Apr 21
Abstract

Global contextual information and local detail features are both essential for haze removal. Deep learning models perform well on small, low-resolution images, but they struggle with large, high-resolution ones due to GPU memory limitations, so they typically resort to image slicing or downsampling: the former discards global information, while the latter loses high-frequency detail. To address these challenges, we propose DehazeXL, a haze removal method that effectively balances global context and local feature extraction, enabling end-to-end modeling of large images on mainstream GPU hardware. Additionally, to evaluate how efficiently global context is utilized in haze removal, we design a visual attribution method tailored to the characteristics of the task. Finally, recognizing the lack of benchmark datasets for haze removal on large images, we have developed an ultra-high-resolution haze removal dataset (8KDehaze) to support model training and testing. It includes 10,000 pairs of clear and hazy remote sensing images, each sized 8192 × 8192 pixels. Extensive experiments demonstrate that DehazeXL can infer images up to 10240 × 10240 pixels with only 21 GB of memory, achieving state-of-the-art results among all evaluated methods. The source code and experimental dataset are available at https://github.com/CastleChen339/DehazeXL.
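At a high level, the abstract describes a tokenize–fuse–decode flow: each patch is encoded into a compact token, all tokens are fused globally, and each patch is decoded with that global context. The PyTorch sketch below illustrates this flow only; it is not the authors' implementation, and the patch size, encoder/decoder structures, and the choice of a standard transformer encoder for fusion are all assumptions.

```python
import torch
import torch.nn as nn

class DehazeSketch(nn.Module):
    """Tokenize patches -> fuse tokens globally -> decode patches.
    A minimal sketch; all hyperparameters are illustrative, not the paper's."""

    def __init__(self, patch=256, dim=512, depth=4, heads=8):
        super().__init__()
        self.patch = patch
        # Per-patch encoder: compress each patch into one token (assumed design).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, dim, kernel_size=16, stride=16),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # Global fusion across all patch tokens via a standard transformer.
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.fusion = nn.TransformerEncoder(layer, depth)
        # Per-patch decoder conditioned on the fused global token (assumed design).
        self.decoder = nn.Sequential(
            nn.Conv2d(3 + dim, 64, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1),
        )

    def forward(self, x):
        b, c, h, w = x.shape          # h, w assumed divisible by self.patch
        p = self.patch
        gh, gw = h // p, w // p
        # Slice the image into non-overlapping patches: (B*gh*gw, C, p, p).
        patches = x.unfold(2, p, p).unfold(3, p, p)
        patches = patches.permute(0, 2, 3, 1, 4, 5).reshape(-1, c, p, p)
        tokens = self.encoder(patches)                    # (B*gh*gw, dim)
        fused = self.fusion(tokens.view(b, gh * gw, -1))  # global context fusion
        fused = fused.reshape(-1, fused.shape[-1])
        # Broadcast each patch's fused token over its pixels and decode.
        cond = fused[:, :, None, None].expand(-1, -1, p, p)
        out = self.decoder(torch.cat([patches, cond], dim=1))
        # Reassemble patches into the full-resolution output.
        out = out.view(b, gh, gw, c, p, p).permute(0, 3, 1, 4, 2, 5)
        return out.reshape(b, c, h, w)
```

Note that this sketch encodes and decodes all patches at once; to reach the memory figures reported in the abstract, the per-patch encode/decode stages would presumably be streamed in chunks so that only the compact token sequence resides in GPU memory at any one time.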

Community

Paper author and submitter:

In this paper, we propose DehazeXL, an end-to-end haze removal method that effectively integrates global information interaction with local feature extraction, enabling efficient processing of large images while minimizing GPU memory usage. To evaluate how efficiently global context is utilized in haze removal, we design a visual attribution method for haze removal tasks. Furthermore, we develop an ultra-high-resolution haze removal dataset, 8KDehaze. Quantitative and qualitative results demonstrate that DehazeXL outperforms state-of-the-art haze removal techniques in both accuracy and inference speed across multiple high-resolution datasets. The attribution analysis highlights the significance of global information in image dehazing tasks.
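The paper's own attribution method is not detailed in this summary. As a generic stand-in, an occlusion-style probe (a standard attribution technique, not the authors' method) can measure how much each distant region contributes to the dehazed output at a target patch; the `model`, grid size, and baseline fill value below are assumptions:

```python
import torch

@torch.no_grad()
def occlusion_attribution(model, hazy, target_box, grid=8):
    """Generic occlusion probe (not the paper's attribution method):
    mask one grid cell at a time and measure how the dehazed output
    inside `target_box` changes. Larger change = higher contribution."""
    x0, y0, x1, y1 = target_box
    base = model(hazy)[..., y0:y1, x0:x1]      # reference dehazed target region
    _, _, h, w = hazy.shape
    ch, cw = h // grid, w // grid
    heat = torch.zeros(grid, grid)
    for i in range(grid):
        for j in range(grid):
            occluded = hazy.clone()
            # Replace one cell with the image mean as a neutral fill (assumption).
            occluded[..., i*ch:(i+1)*ch, j*cw:(j+1)*cw] = hazy.mean()
            out = model(occluded)[..., y0:y1, x0:x1]
            heat[i, j] = (out - base).abs().mean()
    return heat  # grid x grid contribution map for the target region
```

A heatmap like this makes the claim testable: if cells far from the target region carry non-trivial weight, the model is indeed exploiting global context rather than purely local cues.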

