no code implementations • COLING 2022 • Rui Li, Cheng Liu, Dazhi Jiang
Recently, fine-tuning pre-trained language models (PrLMs) on labeled sentiment datasets has demonstrated impressive performance.
no code implementations • 21 Dec 2023 • Qing Zhang, Cheng Liu, Bo Liu, Haitong Huang, Ying Wang, Huawei Li, Xiaowei Li
Fault-tolerant deep learning accelerators are the basis for highly reliable deep learning processing and are critical to deploying deep learning in safety-critical applications such as avionics and robotics.
no code implementations • 6 Nov 2023 • Siyi Zhang, Cheng Liu, Xiang Li, Xin Zhai, Zhen Wei, Sizhe Li, Xun Ma
The current trend of automating substation inspections has sparked a surge of interest in transformer image recognition.
no code implementations • 16 Aug 2023 • Xinghua Xue, Cheng Liu, Bo Liu, Haitong Huang, Ying Wang, Tao Luo, Lei Zhang, Huawei Li, Xiaowei Li
When applied to fault-tolerant neural networks enhanced with fault-aware retraining and constrained activation functions, the resulting model accuracy generally shows significant improvement in the presence of various faults.
1 code implementation • 15 Aug 2023 • Akshaj Gaur, Cheng Liu, Xiaomin Lin, Nare Karapetyan, Yiannis Aloimonos
With a number of marine populations in rapid decline, collecting and analyzing data about them has become increasingly important for developing effective conservation policies for a wide range of marine animals, including whales.
no code implementations • 8 Aug 2023 • Yikun Liu, Yuning Wang, Cheng Liu
Promptly and accurately detecting natural deterioration and man-made damage on the surfaces of ancient steles is essential for their preventive conservation.
no code implementations • 20 Jun 2023 • Haitong Huang, Cheng Liu
The reliability of deep learning accelerators (DLAs) used in autonomous driving systems has a significant impact on system safety.
1 code implementation • 20 Jun 2023 • Haitong Huang, Cheng Liu, Bo Liu, Xinghua Xue, Huawei Li, Xiaowei Li
It enables users to modify an independent fault configuration file rather than neural network models for the fault injection and vulnerability analysis.
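The idea of decoupling fault specification from the model can be sketched as follows. This is a minimal, hypothetical illustration (the configuration keys, function names, and bit-flip policy here are assumptions for demonstration, not the paper's actual framework): a standalone fault configuration selects a layer, a bit position, and a fault count, and the injector flips bits in the IEEE-754 encoding of randomly chosen weights without touching the model definition.

```python
import struct
import numpy as np

# Hypothetical standalone fault configuration: the model code is untouched;
# only this dictionary changes between fault-injection campaigns.
fault_config = {
    "layer": "conv1",   # which layer's weights to target
    "bit": 20,          # bit position in the float32 encoding (0 = LSB)
    "num_faults": 2,    # how many weights to perturb
    "seed": 0,          # RNG seed for reproducible campaigns
}

def flip_bit(value, bit):
    """Flip one bit in the IEEE-754 float32 encoding of `value`."""
    (as_int,) = struct.unpack("<I", struct.pack("<f", float(value)))
    (faulty,) = struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))
    return faulty

def inject_faults(weights, cfg):
    """Return a copy of `weights` with cfg['num_faults'] random bit flips."""
    rng = np.random.default_rng(cfg["seed"])
    faulty = weights.astype(np.float32).copy()
    flat = faulty.ravel()  # view into the copy, so writes land in `faulty`
    for idx in rng.choice(flat.size, cfg["num_faults"], replace=False):
        flat[idx] = flip_bit(flat[idx], cfg["bit"])
    return faulty

model = {"conv1": np.ones((3, 3), dtype=np.float32)}
faulty = inject_faults(model[fault_config["layer"]], fault_config)
print(np.count_nonzero(faulty != model["conv1"]))  # prints 2
```

Keeping the configuration in a separate file (rather than editing the network) means the same model can be swept over bit positions, layers, and fault counts by changing one artifact.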
no code implementations • 20 Feb 2023 • Hao Lv, Bing Li, Lei Zhang, Cheng Liu, Ying Wang
RRAM-based neuromorphic computing systems have attracted explosive interest for their superior data processing capability and energy efficiency compared to traditional architectures, and are thus widely used in many data-centric applications.
no code implementations • CVPR 2023 • Xiwen Wei, Zhen Xu, Cheng Liu, Si Wu, Zhiwen Yu, Hau San Wong
To address this limitation, we propose a Text-guided Unsupervised StyleGAN Latent Transformation (TUSLT) model, which adaptively infers a single transformation step in the latent space of StyleGAN to simultaneously manipulate multiple attributes on a given input image.
1 code implementation • 26 Oct 2022 • Xiaomin Lin, Cheng Liu, Allen Pattillo, Miao Yu, Yiannis Aloimonos
To this end, we present a new benchmark suite, SeaDroneSim, that can be used to create photo-realistic aerial image datasets with the ground truth for segmentation masks of any given object.
no code implementations • 12 Oct 2022 • Haitong Huang, Xinghua Xue, Cheng Liu, Ying Wang, Tao Luo, Long Cheng, Huawei Li, Xiaowei Li
Prior work mainly relies on fault simulation to analyze the influence of soft errors on NN processing.
no code implementations • 5 Apr 2022 • Cheng Liu, Zhen Gao, Siting Liu, Xuefei Ning, Huawei Li, Xiaowei Li
With the rapid advancements of deep learning in the past decade, it can be foreseen that deep learning will be continuously deployed in more and more safety-critical applications such as autonomous driving and robotics.
1 code implementation • 28 Mar 2022 • Cheng Liu, Erik-Jan van Kampen, Guido C. H. E. de Croon
Enabling the capability of assessing risk and making risk-aware decisions is essential to applying reinforcement learning to safety-critical robots like drones.
no code implementations • 17 Feb 2022 • Xinghua Xue, Haitong Huang, Cheng Liu, Ying Wang, Tao Luo, Lei Zhang
Winograd convolution was originally proposed to reduce computing overhead by replacing multiplications in neural network (NN) processing with additions via linear transformation.
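The multiplication-saving transformation can be illustrated with the standard F(2,3) Winograd algorithm (a textbook instance, not this paper's specific contribution): two outputs of a 1D convolution with a 3-tap filter are computed with 4 element-wise multiplications instead of the 6 a direct sliding dot product needs, at the cost of a few extra additions inside the transforms.

```python
import numpy as np

# Standard Winograd F(2,3) transform matrices.
BT = np.array([[1,  0, -1,  0],
               [0,  1,  1,  0],
               [0, -1,  1,  0],
               [0,  1,  0, -1]], dtype=float)   # input transform
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])                # filter transform
AT = np.array([[1, 1,  1,  0],
               [0, 1, -1, -1]], dtype=float)    # output (inverse) transform

def winograd_f23(d, g):
    """d: 4 input samples, g: 3 filter taps -> 2 convolution outputs."""
    U = G @ g      # transform the filter (can be precomputed once)
    V = BT @ d     # transform the input tile
    M = U * V      # only 4 element-wise multiplications
    return AT @ M  # inverse transform back to 2 outputs

d = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([0.5, 1.0, -1.0])
direct = np.array([d[i:i+3] @ g for i in range(2)])  # sliding-dot-product reference
print(winograd_f23(d, g), direct)  # both print [-0.5  0. ]
```

Since the filter transform `G @ g` is reused across all tiles, the per-tile cost reduces to the four multiplications in `U * V`, which is why Winograd is attractive on multiplication-dominated accelerators.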
1 code implementation • 6 Dec 2021 • Yuanjie Gu, Zhibo Xiao, Yinghan Guan, Haoran Dai, Cheng Liu, Liang Xue, Shouyu Wang
In this study, we show that the structures of generative networks capture a great deal of image feature priors, and that these priors are sufficient to reconstruct high-quality fused super-resolution results using only low-resolution inputs.
1 code implementation • 12 Oct 2021 • Yuanjie Gu, Yinghan Guan, Zhibo Xiao, Haoran Dai, Cheng Liu, Shouyu Wang
Multi-focus image fusion (MFIF) and super-resolution (SR) are inverse problems of the imaging model; the purpose of MFIF and SR is to obtain all-in-focus and high-resolution 2D mappings of targets.
no code implementations • 7 Jul 2021 • Dawen Xu, Meng He, Cheng Liu, Ying Wang, Long Cheng, Huawei Li, Xiaowei Li, Kwang-Ting Cheng
It incorporates the remote AIoT processor with soft errors into the training loop, such that the on-site computing errors can be learned with the application data on the server and the retrained models become resilient to the soft errors.
no code implementations • ICLR 2020 • Xiandong Zhao, Ying Wang, Xuyi Cai, Cheng Liu, Lei Zhang
With the proliferation of specialized neural network processors that operate on low-precision integers, the performance of Deep Neural Network inference becomes increasingly dependent on the result of quantization.
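The dependence on quantization can be made concrete with a minimal symmetric per-tensor int8 scheme, the kind low-precision integer processors typically consume. This is an illustrative sketch of uniform quantization in general (the function names and the specific scheme are assumptions, not the paper's method): floats are mapped to int8 through a single scale factor, and the round-trip error is bounded by half the scale.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor quantization: float32 -> int8 plus one scale."""
    scale = np.abs(x).max() / 127.0          # map the largest magnitude to 127
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from the int8 codes."""
    return q.astype(np.float32) * scale

w = np.array([-1.0, -0.5, 0.0, 0.25, 1.0], dtype=np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
print(q, np.abs(w - w_hat).max())  # round-trip error is at most scale / 2
```

The choice of scale (and, in finer-grained schemes, per-channel scales or learned clipping thresholds) is exactly the knob that determines how much accuracy survives the move to integer arithmetic.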
2 code implementations • 3 Jul 2019 • Dawen Xu, Ying Wang, Kaijie Tu, Cheng Liu, Bingsheng He, Lei Zhang
Generative neural networks are a new category of neural networks that have been widely utilized in applications such as content generation, unsupervised learning, segmentation, and pose estimation.
no code implementations • 26 Feb 2019 • Chuangyi Gui, Long Zheng, Bingsheng He, Cheng Liu, Xinyu Chen, Xiaofei Liao, Hai Jin
The graph is a well-known data structure for representing associated relationships in a variety of applications, e.g., data science and machine learning.
Distributed, Parallel, and Cluster Computing