Search Results for author: Zhengfa Liang

Found 6 papers, 3 papers with code

Gated Cross-Attention Network for Depth Completion

no code implementations · 28 Sep 2023 · Xiaogang Jia, Songlei Jian, Yusong Tan, Yonggang Che, Wei Chen, Zhengfa Liang

With a simple yet efficient gating mechanism, our proposed method achieves fast and accurate depth completion without the need for additional branches or post-processing steps.

Autonomous Driving · Depth Completion +1
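The abstract describes fusing information through a learned gate rather than extra branches. As an illustration only (the paper's actual architecture is not shown here), a sigmoid gate can blend two feature maps elementwise; the shapes, the 1x1 projection, and the function name below are all assumptions for the sketch.

```python
import numpy as np

def gated_fusion(depth_feat, rgb_feat, w_gate, b_gate):
    """Blend two (C, H, W) feature maps with a learned sigmoid gate.

    Hypothetical sketch: the gate is produced from the concatenated
    features by a 1x1 projection (a matmul over the channel axis).
    """
    stacked = np.concatenate([depth_feat, rgb_feat], axis=0)           # (2C, H, W)
    logits = np.tensordot(w_gate, stacked, axes=([1], [0]))            # (C, H, W)
    logits += b_gate[:, None, None]
    gate = 1.0 / (1.0 + np.exp(-logits))                               # sigmoid in (0, 1)
    # Convex combination: the gate decides how much each branch contributes.
    return gate * depth_feat + (1.0 - gate) * rgb_feat

rng = np.random.default_rng(0)
d = rng.standard_normal((4, 8, 8))       # e.g. sparse-depth branch features
r = rng.standard_normal((4, 8, 8))       # e.g. RGB branch features
w = rng.standard_normal((4, 8)) * 0.1    # (C, 2C) gate projection
fused = gated_fusion(d, r, w, np.zeros(4))
```

Because the gate is a sigmoid, every fused value lies between the two branch values at that position, which is what makes gating a cheap alternative to a separate refinement branch.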

Pseudo-label Correction and Learning For Semi-Supervised Object Detection

no code implementations · 6 Mar 2023 · Yulin He, Wei Chen, Ke Liang, Yusong Tan, Zhengfa Liang, Yulan Guo

Our proposed method, Pseudo-label Correction and Learning (PCL), is extensively evaluated on the MS COCO and PASCAL VOC benchmarks.

object-detection · Object Detection +2

Parallax Attention for Unsupervised Stereo Correspondence Learning

1 code implementation · 16 Sep 2020 · Longguang Wang, Yulan Guo, Yingqian Wang, Zhengfa Liang, Zaiping Lin, Jungang Yang, Wei An

Based on our PAM, we propose a parallax-attention stereo matching network (PASMnet) and a parallax-attention stereo image super-resolution network (PASSRnet) for stereo matching and stereo image super-resolution tasks.

Stereo Image Super-Resolution · Stereo Matching
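The core idea behind a parallax-attention mechanism is that in rectified stereo pairs, correspondences lie on the same image row, so attention only needs to run along that row. The sketch below shows this epipolar attention pattern in plain NumPy; it is a minimal illustration of the idea, not the published PAM module, and all shapes and names are assumptions.

```python
import numpy as np

def parallax_attention(feat_left, feat_right):
    """Row-wise (epipolar) attention between rectified stereo features.

    For each row y, every left-view position attends over all positions
    in the same row of the right view, giving a W x W attention map per
    row instead of a full (H*W) x (H*W) map. Features are (C, H, W).
    """
    c, h, w = feat_left.shape
    warped = np.empty_like(feat_left)
    for y in range(h):
        q = feat_left[:, y, :].T              # (W, C) queries from the left view
        k = feat_right[:, y, :]               # (C, W) keys from the right view
        scores = q @ k / np.sqrt(c)           # (W, W) matching scores along the row
        scores -= scores.max(axis=1, keepdims=True)   # numerical stability
        attn = np.exp(scores)
        attn /= attn.sum(axis=1, keepdims=True)       # softmax over right positions
        # Aggregate right-view features for each left-view position.
        warped[:, y, :] = (attn @ feat_right[:, y, :].T).T
    return warped

rng = np.random.default_rng(2)
fl = rng.standard_normal((4, 5, 6))
fr = rng.standard_normal((4, 5, 6))
out = parallax_attention(fl, fr)
```

Restricting attention to the epipolar line is what lets such a mechanism handle arbitrary disparity ranges without building a fixed-size cost volume.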

Learning Parallax Attention for Stereo Image Super-Resolution

1 code implementation · CVPR 2019 · Longguang Wang, Yingqian Wang, Zhengfa Liang, Zaiping Lin, Jungang Yang, Wei An, Yulan Guo

Stereo image pairs can be used to improve the performance of super-resolution (SR) since additional information is provided from a second viewpoint.

Stereo Image Super-Resolution

Learning for Disparity Estimation through Feature Constancy

2 code implementations · CVPR 2018 · Zhengfa Liang, Yiliu Feng, Yulan Guo, Hengzhu Liu, Wei Chen, Linbo Qiao, Li Zhou, Jianfeng Zhang

The second part performs matching cost calculation, matching cost aggregation, and disparity calculation to estimate the initial disparity using shared features.

Disparity Estimation · Stereo Matching +1
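The three steps named in the abstract (matching cost calculation, cost aggregation, disparity calculation) are the classic stereo pipeline that learned networks build on. Below is a textbook NumPy sketch of those steps on grayscale images, using absolute-difference costs, box-window aggregation, and winner-takes-all selection; it illustrates the pipeline only and is not the paper's learned two-stage network.

```python
import numpy as np

def estimate_disparity(left, right, max_disp=8, win=3):
    """Classic stereo pipeline on grayscale (H, W) images:
    cost calculation -> cost aggregation -> winner-takes-all disparity.
    """
    h, w = left.shape
    big = 1e3  # placeholder cost where the shifted right view has no pixel
    cost = np.full((max_disp, h, w), big)
    for d in range(max_disp):
        # Matching cost: absolute difference with the right view shifted by d.
        cost[d, :, d:] = np.abs(left[:, d:] - right[:, : w - d])
    # Cost aggregation: sum each cost slice over a win x win neighbourhood.
    pad = win // 2
    padded = np.pad(cost, ((0, 0), (pad, pad), (pad, pad)), mode="edge")
    agg = np.zeros_like(cost)
    for dy in range(win):
        for dx in range(win):
            agg += padded[:, dy : dy + h, dx : dx + w]
    # Disparity calculation: winner-takes-all over the aggregated volume.
    return np.argmin(agg, axis=0)

# Synthetic check: shift a random texture by a known disparity of 3 pixels.
rng = np.random.default_rng(1)
right = rng.standard_normal((16, 24))
left = np.empty_like(right)
left[:, 3:] = right[:, :-3]
left[:, :3] = right[:, :3]
disp = estimate_disparity(left, right)
```

On textured regions the interior of `disp` recovers the true shift; learned methods replace each of the three hand-crafted steps with trainable layers while keeping this overall structure.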
