no code implementations • 24 Feb 2024 • Chenrui Duan, Zelin Zang, Yongjie Xu, Hang He, Zihan Liu, Zijia Song, Ju-Sheng Zheng, Stan Z. Li
Metagenomic data, comprising mixed multi-species genomes, are prevalent in diverse environments like oceans and soils, significantly impacting human health and ecological functions.
1 code implementation • 20 Feb 2024 • Kai Wang, Zhaopan Xu, Yukun Zhou, Zelin Zang, Trevor Darrell, Zhuang Liu, Yang You
The autoencoder extracts latent representations from a subset of the trained network parameters.
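The core operation, encoding flattened network parameters into a low-dimensional latent space, can be illustrated with a minimal NumPy sketch. All shapes here are illustrative assumptions, and the SVD shortcut (a linear autoencoder trained with MSE recovers the PCA subspace) stands in for the paper's learned autoencoder:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in: 100 "trained networks", each flattened to 512 parameters.
params = rng.normal(size=(100, 512))

# A linear autoencoder with MSE loss recovers the PCA subspace,
# so we sketch encoder/decoder with a truncated SVD.
mean = params.mean(axis=0)
U, S, Vt = np.linalg.svd(params - mean, full_matrices=False)
k = 8                                        # latent dimensionality (assumed)
encode = lambda p: (p - mean) @ Vt[:k].T     # latent representation z
decode = lambda z: z @ Vt[:k] + mean         # reconstructed parameters

z = encode(params)
recon = decode(z)
print(z.shape)                               # (100, 8)
```

A nonlinear autoencoder would replace the two linear maps with small MLPs, but the interface (parameters in, latents out, parameters reconstructed) is the same.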
1 code implementation • 15 Jan 2024 • Zelin Zang, Liangyu Li, Yongjie Xu, Chenrui Duan, Kai Wang, Yang You, Yi Sun, Stan Z. Li
MuST integrates the multi-modality information contained in the ST data effectively into a uniform latent space to provide a foundation for all the downstream tasks.
no code implementations • 12 Jan 2024 • Bozhen Hu, Zelin Zang, Jun Xia, Lirong Wu, Cheng Tan, Stan Z. Li
The purpose of attributed graph embedding is to represent graph data in a low-dimensional space for subsequent tasks.
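A minimal sketch of what "attributed" means here: the embedding must reflect both the adjacency structure and the node features. The toy graph, propagation depth, and SVD readout below are illustrative assumptions, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)
n, f = 6, 4
A = np.zeros((n, n))
edges = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]  # two disjoint triangles
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
X = rng.normal(size=(n, f))                  # node attributes

# Symmetric normalization with self-loops, then two propagation steps:
# structure smooths the attributes before dimensionality reduction.
A_hat = A + np.eye(n)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
P = D_inv_sqrt @ A_hat @ D_inv_sqrt
H = P @ (P @ X)                              # structure-aware node features

U, S, Vt = np.linalg.svd(H, full_matrices=False)
Z = U[:, :2] * S[:2]                         # 2-D node embeddings
print(Z.shape)                               # (6, 2)
```

Nodes in the same triangle end up with similar smoothed features, so they cluster in the latent space; learned methods replace the fixed SVD with a trained encoder.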
no code implementations • 12 Jan 2024 • Bozhen Hu, Zelin Zang, Cheng Tan, Stan Z. Li
Protein representation learning is critical in various biological tasks, such as drug design and protein structure or function prediction; it has primarily benefited from protein language models and graph neural networks.
no code implementations • 5 Jan 2024 • Ge Wang, Zelin Zang, Jiangbin Zheng, Jun Xia, Stan Z. Li
The mainstream method is utilizing contrastive learning to facilitate graph feature extraction, known as Graph Contrastive Learning (GCL).
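The contrastive objective at the heart of GCL can be sketched in NumPy. This is a generic InfoNCE/NT-Xent-style loss over two augmented views (the temperature and batch shapes are illustrative assumptions), not the specific formulation of this paper:

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """Contrastive loss between two views; matching rows are positive pairs."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                             # cosine similarities
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))                   # positives on the diagonal

rng = np.random.default_rng(0)
z = rng.normal(size=(32, 16))
loss_pos = info_nce(z, z)                                # identical views: low loss
loss_rand = info_nce(z, rng.normal(size=(32, 16)))       # unrelated views: high loss
print(loss_pos < loss_rand)                              # True
```

In GCL the two views come from graph augmentations (edge dropping, feature masking) passed through a shared GNN encoder.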
no code implementations • 10 Sep 2023 • Zelin Zang, Hao Luo, Kai Wang, Panpan Zhang, Fan Wang, Stan Z. Li, Yang You
When applied to biological data, DiffAug improves performance by up to 10.1%, with an average improvement of 5.8%.
1 code implementation • ICCV 2023 • Zelin Zang, Lei Shang, Senqiao Yang, Fei Wang, Baigui Sun, Xuansong Xie, Stan Z. Li
The SCL loss weakens the adverse effects of the data augmentation view-noise problem, which is amplified in domain transfer tasks.
Ranked #3 on Universal Domain Adaptation on Office-31
1 code implementation • 21 Nov 2022 • Zelin Zang, Shenghui Cheng, Linyan Lu, Hanchen Xia, Liangyu Li, Yaoting Sun, Yongjie Xu, Lei Shang, Baigui Sun, Stan Z. Li
The proposed techniques are integrated with a visual interface to help the user to adjust EVNet to achieve better DR performance and explainability.
no code implementations • 21 Nov 2022 • Zelin Zang, Lei Shang, Senqiao Yang, Fei Wang, Baigui Sun, Xuansong Xie, Stan Z. Li
The SCL loss weakens the adverse effects of the data augmentation view-noise problem, which is amplified in domain transfer tasks.
no code implementations • 8 Jul 2022 • Zelin Zang, Yongjie Xu, Linyan Lu, Yulan Geng, Senqiao Yang, Stan Z. Li
We propose that the ideal DR approach combines both FS and FP into a unified end-to-end manifold learning framework, simultaneously performing fundamental feature discovery while maintaining the intrinsic relationships between data samples in the latent space.
2 code implementations • 7 Jul 2022 • Zelin Zang, Siyuan Li, Di Wu, Ge Wang, Lei Shang, Baigui Sun, Hao Li, Stan Z. Li
To overcome the underconstrained embedding problem, we design a loss and theoretically demonstrate that it leads to a more suitable embedding based on the local flatness.
Ranked #2 on Image Classification on ImageNet-100
3 code implementations • 27 May 2022 • Siyuan Li, Di Wu, Fang Wu, Zelin Zang, Stan Z. Li
We then propose an Architecture-Agnostic Masked Image Modeling framework (A$^2$MIM), which is compatible with both Transformers and CNNs in a unified way.
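The architecture-agnostic part of masked image modeling is the masking itself, which touches only the input tensor and so works identically for Transformers and CNNs. Below is a minimal NumPy sketch using mean-value patch filling; the patch size, mask ratio, and single-channel image are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(size=(32, 32))                       # toy single-channel "image"
patch = 8
grid = 32 // patch
mask = rng.random((grid, grid)) < 0.6                 # mask ~60% of patches

masked = img.copy()
for i in range(grid):
    for j in range(grid):
        if mask[i, j]:
            # Fill masked patches with the image mean (a mask-token stand-in).
            masked[i*patch:(i+1)*patch, j*patch:(j+1)*patch] = img.mean()

# The model is then trained to regress the original pixels at masked positions:
pixel_mask = np.repeat(np.repeat(mask, patch, 0), patch, 1)
target = img[pixel_mask]
print(target.size == mask.sum() * patch * patch)      # True
```

Because the corruption lives entirely in pixel space, any backbone that maps images to dense features can be plugged in behind it.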
1 code implementation • 27 Oct 2021 • Siyuan Li, Zicheng Liu, Zelin Zang, Di Wu, ZhiYuan Chen, Stan Z. Li
For example, dimension reduction methods such as t-SNE and UMAP optimize pair-wise data relationships to preserve the global geometric structure, while self-supervised learning methods such as SimCLR and BYOL focus on mining the local statistics of instances under specific augmentations.
no code implementations • 20 Oct 2021 • Zihan Liu, Yun Luo, Zelin Zang, Stan Z. Li
Gray-box graph attacks aim at disrupting the performance of the victim model by using inconspicuous attacks with limited knowledge of the victim model.
1 code implementation • 30 Jun 2021 • Di Wu, Siyuan Li, Zelin Zang, Stan Z. Li
Self-supervised contrastive learning has demonstrated great potential in learning visual representations.
Ranked #22 on Fine-Grained Image Classification on NABirds
1 code implementation • 27 Apr 2021 • Zelin Zang, Siyuan Li, Di Wu, Jianzhu Guo, Yongjie Xu, Stan Z. Li
Unsupervised attributed graph representation learning is challenging since both structural and feature information are required to be represented in the latent space.
Ranked #2 on Node Clustering on Pubmed
no code implementations • 1 Jan 2021 • Stan Z. Li, Zelin Zang, Lirong Wu
The ability to preserve local geometry of highly nonlinear manifolds in high dimensional spaces and properly unfold them into lower dimensional hyperplanes is the key to the success of manifold computing, nonlinear dimensionality reduction (NLDR) and visualization.
no code implementations • 1 Dec 2020 • Stan Z. Li, Lirong Wu, Zelin Zang
In this paper, we propose a novel neural network-based method, called Consistent Representation Learning (CRL), to accomplish the three associated tasks end-to-end and improve their consistency.
no code implementations • 28 Oct 2020 • Stan Z. Li, Zelin Zang, Lirong Wu
The LGP constraints constitute the loss for deep manifold learning and serve as geometric regularizers for NLDR network training.
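A local-geometry-preserving constraint of this kind can be sketched as a penalty on how far the input-to-embedding distance ratio deviates from isometry on nearest-neighbor pairs. The neighborhood size and the exact penalty form below are illustrative assumptions, not the paper's LGP definition:

```python
import numpy as np

def geometry_penalty(X, Y, k=5, eps=1e-12):
    """Penalize deviation from local isometry between data X and embedding Y
    (rows aligned), measured on each point's k nearest input-space neighbors."""
    def dists(A):
        sq = np.sum(A**2, axis=1)
        return np.sqrt(np.maximum(sq[:, None] + sq[None, :] - 2 * A @ A.T, 0.0))
    DX, DY = dists(X), dists(Y)
    penalty = 0.0
    for i in range(len(X)):
        nbrs = np.argsort(DX[i])[1:k + 1]           # k nearest neighbors (skip self)
        ratio = DY[i, nbrs] / (DX[i, nbrs] + eps)   # local expansion ratios
        penalty += np.sum((ratio - 1.0) ** 2)       # bi-Lipschitz-style deviation
    return penalty / (len(X) * k)

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
print(geometry_penalty(X, X))                       # ~0: identity map is an isometry
print(geometry_penalty(X, rng.normal(size=(30, 2))) > 0)  # True for a random embedding
```

Used as a regularizer, such a term pushes the network toward embeddings that neither collapse nor tear local neighborhoods.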
1 code implementation • 7 Oct 2020 • Siyuan Li, Haitao Lin, Zelin Zang, Lirong Wu, Jun Xia, Stan Z. Li
Dimension reduction (DR) aims to learn low-dimensional representations of high-dimensional data with the preservation of essential information.
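The "preservation of essential information" objective can be made concrete with classical metric-MDS stress, used here purely as a stand-in; the planted low-rank data and the PCA baseline are illustrative assumptions:

```python
import numpy as np

def pairwise_dist(A):
    sq = np.sum(A**2, axis=1)
    return np.sqrt(np.maximum(sq[:, None] + sq[None, :] - 2 * A @ A.T, 0.0))

def stress(X_high, X_low):
    """Metric-MDS stress: mismatch between the two pairwise distance matrices."""
    return np.sum((pairwise_dist(X_high) - pairwise_dist(X_low)) ** 2) / 2

rng = np.random.default_rng(0)
Z_true = rng.normal(size=(50, 2))                   # planted 2-D structure
X = Z_true @ rng.normal(size=(2, 10)) + 0.05 * rng.normal(size=(50, 10))

Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pca2 = Xc @ Vt[:2].T                                # projection that respects distances
rand2 = rng.normal(size=(50, 2))                    # embedding that ignores the data
print(stress(X, pca2) < stress(X, rand2))           # True
```

A good DR method drives this kind of discrepancy down; nonlinear methods do so for curved manifolds where a linear projection cannot.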
no code implementations • 28 Sep 2020 • Lirong Wu, Zicheng Liu, Zelin Zang, Jun Xia, Siyuan Li, Stan Z. Li
To overcome the problem that clustering-oriented losses may deteriorate the geometric structure of embeddings in the latent space, an isometric loss is proposed to preserve intra-manifold structure locally, together with a ranking loss for inter-manifold structure globally.
1 code implementation • 21 Sep 2020 • Lirong Wu, Zicheng Liu, Zelin Zang, Jun Xia, Siyuan Li, Stan Z. Li
Though manifold-based clustering has become a popular research topic, we observe that one important factor has been omitted by these works, namely that the defined clustering loss may corrupt the local and global structure of the latent space.
2 code implementations • 15 Jun 2020 • Stan Z. Li, Zelin Zang, Lirong Wu
We propose a novel framework, called Markov-Lipschitz deep learning (MLDL), to tackle geometric deterioration caused by collapse, twisting, or crossing in vector-based neural network transformations for manifold-based representation learning and manifold data generation.
no code implementations • 2 Mar 2020 • Linyan Lu, Zhaohui Yang, Mingzhe Chen, Zelin Zang, Mohammad Shikh-Bahaei
In this paper, a machine learning based deployment framework of unmanned aerial vehicles (UAVs) is studied.