no code implementations • 4 Feb 2024 • Junhua Zeng, Guoxu Zhou, Chao Li, Zhun Sun, Qibin Zhao
Tensor network structure search (TN-SS), which aims to find suitable tensor network (TN) structures for representing high-dimensional problems, greatly improves the efficacy of TNs in various machine learning applications.
no code implementations • 3 Jul 2023 • Qi Jiang, Guoxu Zhou, Qibin Zhao
Concept Factorization (CF), as a novel paradigm of representation learning, has demonstrated superior performance in multi-view clustering tasks.
1 code implementation • NeurIPS 2023 • Andong Wang, Chao Li, Mingyuan Bai, Zhong Jin, Guoxu Zhou, Qibin Zhao
Our analysis indicates that the transformed low-rank parameterization can promisingly enhance robust generalization for t-NNs.
no code implementations • 27 Nov 2022 • Yichun Qiu, Weijun Sun, Guoxu Zhou, Qibin Zhao
Efficient and accurate low-rank approximation (LRA) methods are of great significance for large-scale data analysis.
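As a minimal illustration of low-rank approximation (a generic sketch, not the specific method of the paper above), the truncated SVD gives the best rank-r approximation in the Frobenius norm by the Eckart-Young theorem; the function name here is hypothetical:

```python
import numpy as np

def truncated_svd(A, rank):
    """Best rank-`rank` approximation of A in Frobenius norm
    (Eckart-Young), obtained by truncating the SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Keep only the leading `rank` singular triplets.
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

# Hypothetical usage: an exactly rank-2 matrix is recovered exactly.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
A2 = truncated_svd(A, 2)
```

For large-scale data the full SVD above is the bottleneck; randomized or cross-approximation LRA methods trade a little accuracy for far lower cost.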
no code implementations • 7 Oct 2022 • Peilin Yang, Weijun Sun, Qibin Zhao, Guoxu Zhou
The prevalent fully-connected tensor network (FCTN) has achieved remarkable success in data compression.
no code implementations • 4 Apr 2022 • Peilin Yang, Yonghui Huang, Yuning Qiu, Weijun Sun, Guoxu Zhou
The algorithm composes the completed tensor by initializing the factors with the FCTN decomposition.
no code implementations • 14 Mar 2022 • Yuning Qiu, Guoxu Zhou, Qibin Zhao, Shengli Xie
Experimental results on both synthetic and real-world data demonstrate the effectiveness and efficiency of the proposed model in recovering noisy incomplete tensor data compared with state-of-the-art tensor completion models.
no code implementations • 27 Feb 2022 • Zhenhao Huang, Yuning Qiu, Xinqi Chen, Weijun Sun, Guoxu Zhou
Robust tensor completion (RTC) aims to recover a low-rank tensor from its incomplete observation with outlier corruption.
no code implementations • 3 Jan 2022 • Yuyuan Yu, Guoxu Zhou, Haonan Huang, Shengli Xie, Qibin Zhao
However, existing strategies cannot exploit semi-supervised information and distinguish the importance of views only from a data-feature perspective, which is often influenced by low-quality views and thus leads to poor performance.
1 code implementation • 19 Oct 2021 • Tenghui Li, Guoxu Zhou, Yuning Qiu, Qibin Zhao
We attempt to understand convolutional neural networks by exploring the relationship between (deep) convolutional neural networks and Volterra convolutions.
no code implementations • 6 Sep 2021 • Xinhai Zhao, Yuyuan Yu, Guoxu Zhou, Qibin Zhao, Weijun Sun
For high-dimensional data representation, nonnegative tensor ring (NTR) decomposition equipped with manifold learning has become a promising model to exploit the multi-dimensional structure and extract features from tensor data.
no code implementations • 12 Dec 2020 • Yu Zhang, Tao Zhou, Wei Wu, Hua Xie, Hongru Zhu, Guoxu Zhou, Andrzej Cichocki
With the encoded label matrix, we devise a novel multi-task learning algorithm by exploiting the subclass relationship to jointly optimize the EEG pattern features from the uncovered subclasses.
no code implementations • 12 Oct 2020 • Yuyuan Yu, Guoxu Zhou, Ning Zheng, Shengli Xie, Qibin Zhao
Tensor ring (TR) decomposition is a powerful tool for exploiting the low-rank nature of multiway data and has demonstrated great potential in a variety of important applications.
no code implementations • ICLR 2019 • Xinqi Chen, Ming Hou, Guoxu Zhou, Qibin Zhao
Recent deep multi-task learning (MTL) has witnessed success in alleviating the data scarcity of some tasks by utilizing domain-specific knowledge from related tasks.
no code implementations • 21 Mar 2019 • Jinshi Yu, Chao Li, Qibin Zhao, Guoxu Zhou
Tensor ring (TR) decomposition has been successfully used to obtain the state-of-the-art performance in the visual data completion problem.
no code implementations • 20 Mar 2018 • Jinshi Yu, Guoxu Zhou, Andrzej Cichocki, Shengli Xie
Nonsmooth Nonnegative Matrix Factorization (nsNMF) is capable of producing more localized, less overlapped feature representations than other variants of NMF while keeping satisfactory fit to data.
no code implementations • 20 Nov 2017 • Yuning Qiu, Guoxu Zhou, Kan Xie
Nonnegative Matrix Factorization (NMF) is a widely used technique for data representation.
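As a hedged sketch of the standard NMF baseline the paper builds on (not the paper's own algorithm), the classic Lee-Seung multiplicative updates factor a nonnegative matrix V into nonnegative W and H; all names and parameters below are illustrative:

```python
import numpy as np

def nmf_mu(V, rank, n_iter=500, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for V ~= W @ H with W, H >= 0.

    The update ratios are elementwise nonnegative, so W and H stay
    nonnegative throughout, and the Frobenius loss is non-increasing.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

On exactly low-rank nonnegative data this typically drives the relative reconstruction error close to zero, though only a local minimum is guaranteed.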
1 code implementation • 17 Jun 2016 • Qibin Zhao, Guoxu Zhou, Shengli Xie, Liqing Zhang, Andrzej Cichocki
In this paper, we introduce a fundamental tensor decomposition model that represents a high-dimensional tensor by circular multilinear products over a sequence of low-dimensional cores, which can be graphically interpreted as a cyclic interconnection of 3rd-order tensors and is thus termed tensor ring (TR) decomposition.
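The circular product over 3rd-order cores described above can be sketched directly in numpy; this is a minimal illustration of the TR model (function and variable names are hypothetical), where each core G_k has shape (r_k, n_k, r_{k+1}) and the first and last ranks close the ring:

```python
import numpy as np

def tr_to_full(cores):
    """Contract a list of 3rd-order TR cores G_k of shape (r_k, n_k, r_{k+1})
    into the full tensor; the ring closes because r_0 equals r_N."""
    full = cores[0]                          # (r0, n0, r1)
    for core in cores[1:]:
        # (r0, n0, ..., r_k) x (r_k, n_k, r_{k+1}) -> (r0, n0, ..., n_k, r_{k+1})
        full = np.tensordot(full, core, axes=([-1], [0]))
    # Close the ring: trace over the matching first and last rank indices.
    return np.trace(full, axis1=0, axis2=-1)

# Hypothetical example: a (4, 5, 6) tensor with TR ranks (2, 3, 2).
rng = np.random.default_rng(0)
cores = [rng.standard_normal(s) for s in [(2, 4, 3), (3, 5, 2), (2, 6, 2)]]
T = tr_to_full(cores)                        # shape (4, 5, 6)
```

Elementwise, T[i, j, k] equals the trace of the matrix product G_0[:, i, :] @ G_1[:, j, :] @ G_2[:, k, :], which is exactly the cyclic interconnection in the graphical picture.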
no code implementations • 29 Aug 2015 • Guoxu Zhou, Qibin Zhao, Yu Zhang, Tülay Adalı, Shengli Xie, Andrzej Cichocki
With the increasing availability of various sensor technologies, we now have access to large amounts of multi-block (also called multi-set, multi-relational, or multi-view) data that need to be jointly analyzed to explore their latent connections.
no code implementations • 9 Oct 2014 • Qibin Zhao, Guoxu Zhou, Liqing Zhang, Andrzej Cichocki, Shun-ichi Amari
We propose a generative model for robust tensor factorization in the presence of both missing data and outliers.
no code implementations • 17 Apr 2014 • Guoxu Zhou, Andrzej Cichocki, Qibin Zhao, Shengli Xie
Nonnegative Tucker decomposition (NTD) is a powerful tool for the extraction of nonnegative parts-based and physically meaningful latent components from high-dimensional tensor data while preserving the natural multilinear structure of data.
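The Tucker model underlying NTD composes a small core tensor with one factor matrix per mode; the sketch below shows that composition (a generic Tucker reconstruction under the assumption that NTD additionally constrains all entries to be nonnegative; helper names are hypothetical):

```python
import numpy as np

def mode_product(T, M, mode):
    """n-mode product T x_mode M: contract mode `mode` of T with
    the columns of M, placing M's rows at that mode."""
    return np.moveaxis(np.tensordot(M, T, axes=(1, mode)), 0, mode)

def tucker_to_full(core, factors):
    """Reconstruct a tensor from a Tucker model: the core multiplied
    by one factor matrix along each mode."""
    T = core
    for mode, A in enumerate(factors):
        T = mode_product(T, A, mode)
    return T

# Hypothetical example: a (2, 3, 4) core expanded to a (5, 6, 7) tensor.
rng = np.random.default_rng(0)
core = rng.random((2, 3, 4))
A, B, C = rng.random((5, 2)), rng.random((6, 3)), rng.random((7, 4))
T = tucker_to_full(core, [A, B, C])
```

When the core and factors are all nonnegative, as NTD enforces, each reconstructed entry is a nonnegative combination of parts, which is what yields the parts-based interpretation.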
no code implementations • 26 Aug 2013 • Yu Zhang, Guoxu Zhou, Jing Jin, Xingyu Wang, Andrzej Cichocki
Canonical correlation analysis (CCA) has been one of the most popular methods for frequency recognition in steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs).
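The standard CCA frequency-recognition scheme mentioned above can be sketched as follows: for each candidate stimulation frequency, build sine/cosine references at its harmonics and score the EEG segment by the largest canonical correlation; the highest-scoring frequency wins. This is a generic sketch of that baseline (not the paper's proposed extension), with hypothetical names:

```python
import numpy as np

def max_canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of
    X and Y (both samples x channels), via QR then SVD."""
    Qx, _ = np.linalg.qr(X - X.mean(0))
    Qy, _ = np.linalg.qr(Y - Y.mean(0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_cca(eeg, fs, freqs, n_harmonics=2):
    """Score each candidate frequency by CCA against sine/cosine
    references at its harmonics; return the best-matching frequency."""
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in freqs:
        ref = np.column_stack([fn(2 * np.pi * (h + 1) * f * t)
                               for h in range(n_harmonics)
                               for fn in (np.sin, np.cos)])
        scores.append(max_canonical_corr(eeg, ref))
    return freqs[int(np.argmax(scores))]
```

On a synthetic two-channel segment oscillating at 10 Hz, the 10 Hz candidate attains the highest canonical correlation among nearby candidates.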
no code implementations • 17 Dec 2012 • Guoxu Zhou, Andrzej Cichocki, Yu Zhang, Danilo Mandic
Very often, the data we encounter in practice are a collection of matrices rather than a single matrix.
no code implementations • 15 Nov 2012 • Guoxu Zhou, Andrzej Cichocki, Shengli Xie
Canonical Polyadic decomposition (CPD, also known as CANDECOMP/PARAFAC) is widely applied to analyze high-order tensors.
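For reference, the CP model expresses a tensor as a sum of rank-one terms, one outer product per column of the factor matrices; a minimal reconstruction sketch (generic, with hypothetical names, not the paper's algorithm):

```python
import numpy as np

def cp_to_full(factors, weights=None):
    """Reconstruct a tensor from CP factor matrices A_k of shape (n_k, R):
    T[i1,...,iN] = sum_r w_r * prod_k A_k[i_k, r]."""
    R = factors[0].shape[1]
    if weights is None:
        weights = np.ones(R)
    full = np.zeros(tuple(A.shape[0] for A in factors))
    for r in range(R):
        # Build the r-th rank-one term as an outer product of one
        # column from each factor, scaled by its weight.
        term = np.asarray(weights[r])
        for A in factors:
            term = np.multiply.outer(term, A[:, r])
        full += term
    return full

# Hypothetical example: a rank-3 CP model of a (4, 5, 6) tensor.
rng = np.random.default_rng(0)
A, B, C = rng.random((4, 3)), rng.random((5, 3)), rng.random((6, 3))
T = cp_to_full([A, B, C])
```

The smallest R for which this sum is exact is the tensor's CP rank, the quantity CPD algorithms try to estimate.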