1 code implementation • 12 Mar 2024 • Peiyuan Liu, Hang Guo, Tao Dai, Naiqi Li, Jigang Bao, Xudong Ren, Yong Jiang, Shu-Tao Xia
Recently, with the surge of Large Language Models (LLMs), several works have attempted to introduce LLMs into time series forecasting.
no code implementations • 19 Jan 2024 • Yujun Huang, Bin Chen, Naiqi Li, Baoyi An, Shu-Tao Xia, YaoWei Wang
In this paper, we propose a Measurement-Bounds-based Rate-Adaptive Image Compressed Sensing Network (MB-RACS) framework, which aims to adaptively determine the sampling rate for each image block in accordance with traditional measurement bounds theory.
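The idea of assigning each image block its own sampling rate can be illustrated with a toy allocation rule. This sketch uses per-block variance as a stand-in complexity proxy; MB-RACS itself derives rates from measurement-bounds theory, so the scoring function and rate range here are assumptions for illustration only.

```python
import numpy as np

def allocate_rates(image: np.ndarray, block: int = 8,
                   r_min: float = 0.05, r_max: float = 0.5) -> np.ndarray:
    """Toy block-wise rate allocation: busier blocks get higher sampling rates.

    Block variance is a hypothetical complexity proxy, not the paper's
    measurement-bounds criterion.
    """
    h, w = image.shape
    blocks = image.reshape(h // block, block, w // block, block)
    var = blocks.var(axis=(1, 3))                    # per-block variance
    score = var / var.max() if var.max() > 0 else np.zeros_like(var)
    return r_min + (r_max - r_min) * score           # map score into [r_min, r_max]

# Usage: a 16x16 image with one textured 8x8 block and three flat ones.
img = np.zeros((16, 16))
img[8:, 8:] = np.random.default_rng(0).normal(size=(8, 8))
rates = allocate_rates(img)  # flat blocks -> r_min, textured block -> r_max
```

The point of the adaptive scheme is exactly this non-uniformity: smooth regions can be reconstructed from few measurements, while detailed regions need more.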
1 code implementation • 20 Sep 2023 • Peiyuan Liu, Beiliang Wu, Naiqi Li, Tao Dai, Fengmao Lei, Jigang Bao, Yong Jiang, Shu-Tao Xia
In this paper, we propose a Wavelet-Fourier Transform Network (WFTNet) for long-term time series forecasting.
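A basic building block behind frequency-domain forecasters of this kind is extracting periodicity from the signal's spectrum. The following minimal sketch finds a series' dominant period with the FFT; it is an assumed illustration of the general technique, not WFTNet's architecture.

```python
import numpy as np

def dominant_period(x: np.ndarray) -> int:
    """Return the period of the strongest non-DC frequency component."""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))  # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x))
    k = spectrum[1:].argmax() + 1                 # skip the DC bin
    return round(1.0 / freqs[k])

# Usage: a noisy sine with period 32 over 256 samples.
t = np.arange(256)
signal = np.sin(2 * np.pi * t / 32) + 0.1 * np.random.default_rng(0).normal(size=256)
period = dominant_period(signal)
```

Wavelet transforms complement this global view with time-localized frequency information, which is presumably why the paper combines both.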
no code implementations • ICCV 2023 • Xinyi Zhang, Naiqi Li, Jiawei Li, Tao Dai, Yong Jiang, Shu-Tao Xia
Unsupervised surface anomaly detection aims at discovering and localizing anomalous patterns using only anomaly-free training samples.
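The anomaly-free training setting can be made concrete with a simple baseline: score a test sample by its distance to the nearest feature seen during (normal-only) training. This nearest-neighbor scorer is an assumption for illustration, not the method proposed in the paper.

```python
import numpy as np

def anomaly_score(test_feat: np.ndarray, normal_feats: np.ndarray) -> float:
    """Distance to the closest anomaly-free training feature.

    Large distances suggest the sample lies outside the normal manifold.
    """
    d = np.linalg.norm(normal_feats - test_feat, axis=1)
    return float(d.min())

# Usage: normal features cluster near the origin; an outlier scores higher.
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 0.1, size=(100, 4))
ok = anomaly_score(np.zeros(4), normal)
bad = anomaly_score(np.full(4, 3.0), normal)
```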
no code implementations • 8 Aug 2022 • Jiawei Li, Chenxi Lan, Xinyi Zhang, Bolin Jiang, Yuqiu Xie, Naiqi Li, Yan Liu, Yaowei Li, Enze Huo, Bin Chen
To take a step forward, this paper outlines an automatic annotation system called SsaA, which works in a self-supervised learning manner to continuously perform online visual inspection in manufacturing automation scenarios.
no code implementations • 29 Sep 2021 • Naiqi Li, Wenjie Li, Yong Jiang, Shu-Tao Xia
In this paper, we propose the deep Dirichlet process mixture (DDPM) model, an unsupervised method that simultaneously performs clustering and feature learning.
1 code implementation • NeurIPS 2020 • Naiqi Li, Wenjie Li, Jifeng Sun, Yinghua Gao, Yong Jiang, Shu-Tao Xia
In this paper, we propose Stochastic Deep Gaussian Processes over Graphs (DGPG), deep structured models that learn the mappings between input and output signals in graph domains.