Search Results for author: Hanlin Qin

Found 5 papers, 1 paper with code

DestripeCycleGAN: Stripe Simulation CycleGAN for Unsupervised Infrared Image Destriping

no code implementations · 14 Feb 2024 · Shiqi Yang, Hanlin Qin, Shuai Yuan, Xiang Yan, Hossein Rahmani

However, when applied to the infrared destriping task, it becomes challenging for the vanilla auxiliary generator to consistently produce vertical noise under unsupervised constraints (a brief illustration of such vertical stripe noise follows this entry).

Denoising · Image Restoration
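The "vertical noise" the abstract refers to is column-wise stripe noise in infrared frames. The snippet below is a minimal, hypothetical simulation of such noise (it is not DestripeCycleGAN's auxiliary generator), just to make the term concrete.

```python
import numpy as np

def add_vertical_stripes(image, strength=0.05, rng=None):
    """Add column-wise (vertical) stripe noise to a single-channel image.

    Each column receives one random offset, so the noise is constant along a
    column and varies across columns -- the pattern infrared destriping methods
    aim to remove. Illustrative only; not the paper's stripe-simulation module.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = image.shape
    column_bias = rng.normal(0.0, strength, size=(1, w))  # one offset per column
    return image + column_bias                            # broadcast down each column

# Usage: corrupt a clean frame; a destriping model would learn to undo this.
clean = np.zeros((256, 256), dtype=np.float32)
striped = add_vertical_stripes(clean)
print(striped.std(axis=0).max(), striped.std(axis=1).max())  # ~0 within columns, >0 across columns
```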

SCTransNet: Spatial-channel Cross Transformer Network for Infrared Small Target Detection

1 code implementation · 28 Jan 2024 · Shuai Yuan, Hanlin Qin, Xiang Yan, Naveed Akhtar, Ajmal Mian

In the proposed SCTBs, the outputs of all encoders interact through a cross transformer to generate mixed features, which are redistributed to all decoders to effectively reinforce semantic differences between the target and clutter at full scales.
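A minimal PyTorch sketch of that general idea follows: multi-scale encoder features are flattened into one token sequence, each scale cross-attends to the mixed sequence, and the results are redistributed back to the decoder scales. The module and layer choices here are assumptions for illustration, not the authors' SCTB implementation.

```python
import torch
import torch.nn as nn

class CrossScaleMixer(nn.Module):
    """Hypothetical sketch: mix multi-scale encoder features via cross-attention
    and redistribute the mixed features to each decoder scale."""
    def __init__(self, channels=(32, 64, 128), dim=64, heads=4):
        super().__init__()
        self.to_dim = nn.ModuleList(nn.Conv2d(c, dim, 1) for c in channels)
        self.from_dim = nn.ModuleList(nn.Conv2d(dim, c, 1) for c in channels)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, feats):
        # feats: encoder outputs [B, C_i, H_i, W_i] at different spatial scales.
        b = feats[0].shape[0]
        tokens, shapes = [], []
        for f, proj in zip(feats, self.to_dim):
            f = proj(f)                                   # unify channel width
            shapes.append(f.shape[-2:])
            tokens.append(f.flatten(2).transpose(1, 2))   # [B, H_i*W_i, dim]
        mixed = torch.cat(tokens, dim=1)                  # all scales as one sequence
        outs = []
        for t, (h, w), proj in zip(tokens, shapes, self.from_dim):
            o, _ = self.attn(t, mixed, mixed)             # each scale queries the mixed features
            o = o.transpose(1, 2).reshape(b, -1, h, w)
            outs.append(proj(o))                          # back to decoder channel widths
        return outs

# Usage: three encoder scales of a 32x32 input.
feats = [torch.randn(2, 32, 32, 32), torch.randn(2, 64, 16, 16), torch.randn(2, 128, 8, 8)]
print([m.shape for m in CrossScaleMixer()(feats)])
```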

ARCNet: An Asymmetric Residual Wavelet Column Correction Network for Infrared Image Destriping

no code implementations · 28 Jan 2024 · Shuai Yuan, Hanlin Qin, Xiang Yan, Naveed Akhtar, Shiqi Yang, Shuowen Yang

Our neural model leverages a novel downsampler, the residual Haar discrete wavelet transform (RHDWT), along with stripe-directional prior knowledge and data-driven learning to induce a model with an enriched feature representation of stripe noise and background (see the sketch after this entry).

Feature Upsampling · Image Reconstruction
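To make the downsampler idea concrete, here is a hypothetical sketch of a residual Haar-wavelet downsampler: a fixed 2x2 Haar transform splits each channel into four subbands at half resolution, and a learned strided convolution adds a data-driven residual path. It is illustrative only, not the exact RHDWT block from ARCNet.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualHaarDown(nn.Module):
    """Hypothetical residual Haar downsampler: fixed Haar subbands + learned residual."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        ll = torch.tensor([[0.5, 0.5], [0.5, 0.5]])      # average (low-pass)
        lh = torch.tensor([[0.5, 0.5], [-0.5, -0.5]])    # difference across rows
        hl = torch.tensor([[0.5, -0.5], [0.5, -0.5]])    # difference across columns (catches vertical stripes)
        hh = torch.tensor([[0.5, -0.5], [-0.5, 0.5]])    # diagonal detail
        kernels = torch.stack([ll, lh, hl, hh]).unsqueeze(1)          # [4, 1, 2, 2]
        self.register_buffer("haar", kernels.repeat(in_ch, 1, 1, 1))  # [4*in_ch, 1, 2, 2]
        self.in_ch = in_ch
        self.fuse = nn.Conv2d(4 * in_ch, out_ch, kernel_size=1)
        self.residual = nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=2, padding=1)

    def forward(self, x):
        # Depthwise fixed Haar transform: each channel -> 4 subbands at half resolution.
        bands = F.conv2d(x, self.haar, stride=2, groups=self.in_ch)
        return self.fuse(bands) + self.residual(x)       # wavelet path + learned residual path

# Usage on a single-channel 256x256 infrared frame.
x = torch.randn(1, 1, 256, 256)
print(ResidualHaarDown(in_ch=1, out_ch=32)(x).shape)  # torch.Size([1, 32, 128, 128])
```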

Unsupervised Deep Multi-focus Image Fusion

no code implementations · 19 Jun 2018 · Xiang Yan, Syed Zulqarnain Gilani, Hanlin Qin, Ajmal Mian

Convolutional neural networks have recently been used for multi-focus image fusion.

SSIM

Deep Keyframe Detection in Human Action Videos

no code implementations · 26 Apr 2018 · Xiang Yan, Syed Zulqarnain Gilani, Hanlin Qin, Mingtao Feng, Liang Zhang, Ajmal Mian

Detecting representative frames in videos based on human actions is challenging because of the combined effects of human pose during the action and the background.
