Search Results for author: Naiyuan Liu

Found 6 papers, 5 papers with code

ReLER@ZJU-Alibaba Submission to the Ego4D Natural Language Queries Challenge 2022

1 code implementation · 1 Jul 2022 · Naiyuan Liu, Xiaohan Wang, Xiaobo Li, Yi Yang, Yueting Zhuang

In this report, we present the ReLER@ZJU-Alibaba submission to the Ego4D Natural Language Queries (NLQ) Challenge in CVPR 2022.

Tasks: Data Augmentation, Natural Language Queries

Image Translation via Fine-grained Knowledge Transfer

1 code implementation · 21 Dec 2020 · Xuanhong Chen, Ziang Liu, Ting Qiu, Bingbing Ni, Naiyuan Liu, XiWei Hu, Yuhan Li

Extensive experiments demonstrate the effectiveness and feasibility of our framework across different image-translation tasks.

Tasks: Retrieval, Style Transfer, +2

RainNet: A Large-Scale Imagery Dataset and Benchmark for Spatial Precipitation Downscaling

1 code implementation · 17 Dec 2020 · Xuanhong Chen, Kairui Feng, Naiyuan Liu, Bingbing Ni, Yifan Lu, Zhengyan Tong, Ziang Liu

To alleviate these obstacles, we present RainNet, the first large-scale spatial precipitation downscaling dataset, which contains more than 62,400 pairs of high-quality low/high-resolution precipitation maps spanning over 17 years, ready to support the development of deep learning models for precipitation downscaling.

CooGAN: A Memory-Efficient Framework for High-Resolution Facial Attribute Editing

1 code implementation · ECCV 2020 · Xuanhong Chen, Bingbing Ni, Naiyuan Liu, Ziang Liu, Yiliu Jiang, Loc Truong, Qi Tian

In contrast to the great success of memory-consuming face editing methods at low resolution, manipulating high-resolution (HR) facial images, i.e., typically larger than 768² pixels, with very limited memory remains challenging.

Tasks: Attribute, Image Generation, +2

Anisotropic Stroke Control for Multiple Artists Style Transfer

1 code implementation · 16 Oct 2020 · Xuanhong Chen, Xirui Yan, Naiyuan Liu, Ting Qiu, Bingbing Ni

Furthermore, the results exhibit a distinctive artistic style while retaining anisotropic semantic information.

Tasks: Style Transfer