TAFNet: A Three-Stream Adaptive Fusion Network for RGB-T Crowd Counting

In this paper, we propose a three-stream adaptive fusion network named TAFNet, which uses paired RGB and thermal images for crowd counting. Specifically, TAFNet is divided into one main stream and two auxiliary streams. We combine a pair of RGB and thermal images to constitute the input of the main stream. The two auxiliary streams exploit the RGB image and the thermal image, respectively, to extract modality-specific features. Besides, we propose an Information Improvement Module (IIM) to fuse the modality-specific features into the main stream adaptively. Experimental results on the RGBT-CC dataset show that our method achieves more than 20% improvement in mean absolute error and root mean squared error compared with the state-of-the-art method. The source code will be publicly available at https://github.com/TANGHAIHAN/TAFNet.
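To illustrate the three-stream layout described above, the following is a minimal PyTorch-style sketch, not the authors' implementation: the backbone depth, channel counts, and the internal design of the IIM are assumptions (here a simple channel-attention gate stands in for the adaptive fusion), and the thermal input is assumed to be replicated to 3 channels.

```python
import torch
import torch.nn as nn


class SimpleIIM(nn.Module):
    """Hypothetical adaptive fusion block: gates the modality-specific
    features before adding them to the main-stream features. The real
    IIM design is not described in this abstract."""
    def __init__(self, channels):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels * 2, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, main_feat, aux_feat):
        # Channel-wise weights computed from both streams, then used to
        # adaptively inject the auxiliary features into the main stream.
        w = self.gate(torch.cat([main_feat, aux_feat], dim=1))
        return main_feat + w * aux_feat


class ThreeStreamSketch(nn.Module):
    """Toy three-stream layout: a main stream over the concatenated
    RGB-T input and two auxiliary single-modality streams."""
    def __init__(self, channels=64):
        super().__init__()
        self.main = nn.Conv2d(6, channels, 3, padding=1)     # RGB + thermal (3 + 3 channels)
        self.rgb = nn.Conv2d(3, channels, 3, padding=1)      # RGB auxiliary stream
        self.thermal = nn.Conv2d(3, channels, 3, padding=1)  # thermal auxiliary stream
        self.fuse_rgb = SimpleIIM(channels)
        self.fuse_t = SimpleIIM(channels)
        self.head = nn.Conv2d(channels, 1, 1)                # density-map regression head

    def forward(self, rgb, thermal):
        f_main = self.main(torch.cat([rgb, thermal], dim=1))
        f_main = self.fuse_rgb(f_main, self.rgb(rgb))
        f_main = self.fuse_t(f_main, self.thermal(thermal))
        return self.head(f_main)


if __name__ == "__main__":
    rgb = torch.randn(1, 3, 256, 256)
    t = torch.randn(1, 3, 256, 256)
    density = ThreeStreamSketch()(rgb, t)
    print(density.shape)  # torch.Size([1, 1, 256, 256])
```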
