Search Results for author: Zhijun Tu

Found 8 papers, 3 papers with code

LIPT: Latency-aware Image Processing Transformer

no code implementations • 9 Apr 2024 • Junbo Qiao, Wei Li, Haizhen Xie, Hanting Chen, Yunshuai Zhou, Zhijun Tu, Jie Hu, Shaohui Lin

Extensive experiments on multiple image processing tasks (e.g., image super-resolution (SR), JPEG artifact reduction, and image denoising) demonstrate the superiority of LIPT on both latency and PSNR.

Image Denoising · Image Super-Resolution

IPT-V2: Efficient Image Processing Transformer using Hierarchical Attentions

no code implementations • 31 Mar 2024 • Zhijun Tu, Kunpeng Du, Hanting Chen, Hailing Wang, Wei Li, Jie Hu, Yunhe Wang

Recent advances have demonstrated the powerful capability of the transformer architecture in image restoration.

Deblurring · Denoising +3

A Survey on Transformer Compression

no code implementations • 5 Feb 2024 • Yehui Tang, Yunhe Wang, Jianyuan Guo, Zhijun Tu, Kai Han, Hailin Hu, DaCheng Tao

Model compression methods reduce the memory and computational cost of Transformer models, a necessary step for deploying large language/vision models on practical devices.

Knowledge Distillation · Model Compression +1
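As a concrete taste of the compression techniques such a survey covers, here is a minimal, hedged sketch of post-training dynamic quantization applied to a small Transformer encoder with PyTorch's built-in utility; the architecture and sizes are illustrative, not drawn from the survey itself.

```python
# Minimal sketch: dynamic int8 quantization of a Transformer's feed-forward
# linear layers with PyTorch's built-in utility. The encoder below is an
# illustrative toy, not a model from the survey.
import torch
import torch.nn as nn

model = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=4, batch_first=True),
    num_layers=2,
).eval()

# nn.Linear weights are stored as int8; activations are quantized on the fly.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 16, 256)  # (batch, sequence, features)
with torch.no_grad():
    out = quantized(x)
print(out.shape)  # torch.Size([1, 16, 256])
```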

CBQ: Cross-Block Quantization for Large Language Models

no code implementations • 13 Dec 2023 • Xin Ding, Xiaoyu Liu, Zhijun Tu, Yun Zhang, Wei Li, Jie Hu, Hanting Chen, Yehui Tang, Zhiwei Xiong, Baoqun Yin, Yunhe Wang

Post-training quantization (PTQ) has played a key role in compressing large language models (LLMs) with ultra-low costs.

Quantization
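For context, the hedged sketch below shows the plain round-to-nearest baseline that PTQ methods such as CBQ start from: uniform, per-output-channel quantization of a weight matrix with no retraining. CBQ's cross-block reconstruction itself is not reproduced here; the bit-width and function name are illustrative.

```python
# Minimal PTQ baseline: symmetric round-to-nearest quantization of a weight
# matrix, one scale per output channel. This is the generic starting point,
# not CBQ's cross-block method; n_bits and names are illustrative.
import torch

def quantize_weight_per_channel(w: torch.Tensor, n_bits: int = 4) -> torch.Tensor:
    qmax = 2 ** (n_bits - 1) - 1                      # e.g. 7 for 4 bits
    scale = w.abs().amax(dim=1, keepdim=True) / qmax  # per-row scale
    q = torch.clamp(torch.round(w / scale), -qmax - 1, qmax)
    return q * scale                                  # dequantized weights

w = torch.randn(8, 16)
w_q = quantize_weight_per_channel(w)
print((w - w_q).abs().mean())  # mean quantization error
```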

Data Upcycling Knowledge Distillation for Image Super-Resolution

1 code implementation • 25 Sep 2023 • Yun Zhang, Wei Li, Simiao Li, Hanting Chen, Zhijun Tu, Wenjia Wang, BingYi Jing, Shaohui Lin, Jie Hu

Knowledge distillation (KD) compresses deep neural networks by transferring task-related knowledge from cumbersome pre-trained teacher models to compact student models.

Image Super-Resolution · Knowledge Distillation +1
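The core KD objective described above can be sketched in a few lines: the student minimizes a task loss against the ground truth plus a term matching the frozen teacher's prediction. The toy models, L1 losses, and weighting below are illustrative assumptions; the paper's data-upcycling component is not shown.

```python
# Minimal KD sketch: task loss on ground truth plus a distillation term
# matching the frozen teacher. Toy same-resolution conv nets stand in for
# real SR teacher/student models; alpha is an illustrative weighting.
import torch
import torch.nn as nn

teacher = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
                        nn.Conv2d(64, 3, 3, padding=1)).eval()
student = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                        nn.Conv2d(16, 3, 3, padding=1))

l1 = nn.L1Loss()
lr_img = torch.randn(4, 3, 32, 32)  # input batch
hr_img = torch.randn(4, 3, 32, 32)  # ground-truth targets

with torch.no_grad():
    t_out = teacher(lr_img)         # teacher prediction, no gradients
s_out = student(lr_img)

alpha = 0.5                         # task vs. distillation balance
loss = l1(s_out, hr_img) + alpha * l1(s_out, t_out)
loss.backward()
```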

Toward Accurate Post-Training Quantization for Image Super Resolution

2 code implementations • CVPR 2023 • Zhijun Tu, Jie Hu, Hanting Chen, Yunhe Wang

In this paper, we study post-training quantization (PTQ) for image super-resolution using only a few unlabeled calibration images.

Image Super-Resolution · Quantization
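Calibration is the step that makes the "few unlabeled images" suffice: run them through the network, record activation ranges, and derive quantization parameters from those ranges. The min/max observer below is a generic stand-in for the paper's method; the toy model and 8-bit range are illustrative.

```python
# Minimal calibration sketch: observe activation min/max over a few
# unlabeled inputs via a forward hook, then derive an 8-bit scale and
# zero-point. Generic stand-in, not the paper's exact scheme.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU()).eval()
stats = {"min": float("inf"), "max": float("-inf")}

def observe(module, inputs, output):
    stats["min"] = min(stats["min"], output.min().item())
    stats["max"] = max(stats["max"], output.max().item())

handle = model[1].register_forward_hook(observe)
with torch.no_grad():
    for _ in range(8):                    # a few calibration images
        model(torch.randn(1, 3, 32, 32))
handle.remove()

scale = (stats["max"] - stats["min"]) / 255.0  # 8-bit asymmetric range
zero_point = round(-stats["min"] / scale)
print(scale, zero_point)
```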

AdaBin: Improving Binary Neural Networks with Adaptive Binary Sets

3 code implementations • 17 Aug 2022 • Zhijun Tu, Xinghao Chen, Pengju Ren, Yunhe Wang

Since modern deep neural networks adopt sophisticated, complex architectures for the sake of accuracy, the distributions of their weights and activations are highly diverse.

Classification with Binary Neural Network · Quantization
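The title's "adaptive binary sets" replace the fixed pair {-1, +1} with two levels {beta - alpha, beta + alpha} tailored to each tensor's distribution. In the sketch below, deriving beta and alpha from the mean and standard deviation is an illustrative choice, not necessarily the paper's exact rule.

```python
# Minimal sketch of an adaptive binary set: binarize to {beta - alpha,
# beta + alpha} instead of the fixed {-1, +1}. Using the tensor's mean
# and std for beta/alpha is an assumed, illustrative rule.
import torch

def adabin_binarize(x: torch.Tensor) -> torch.Tensor:
    beta = x.mean()   # center of the binary set
    alpha = x.std()   # half the distance between the two levels
    return torch.where(x >= beta, beta + alpha, beta - alpha)

w = torch.randn(64, 64)
w_b = adabin_binarize(w)
print(torch.unique(w_b))  # exactly two values
```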

Superconductivity and normal-state properties of kagome metal RbV3Sb5 single crystals

no code implementations • 25 Jan 2021 • Qiangwei Yin, Zhijun Tu, Chunsheng Gong, Yang Fu, Shaohua Yan, Hechang Lei

We report the discovery of superconductivity and detailed normal-state physical properties of RbV3Sb5 single crystals with a V kagome lattice.

Superconductivity · Materials Science
