no code implementations • 16 Mar 2024 • Zhen Wang, Wenwen Min
Nonnegative Matrix Factorization (NMF) is a widely applied technique in the fields of machine learning and data mining.
no code implementations • 16 Mar 2024 • Xiaoyu Li, Wenwen Min, Shunfang Wang, Changmiao Wang, Taosheng Xu
Spatially resolved transcriptomics represents a significant advancement in single-cell analysis by offering both gene expression data and their corresponding physical locations.
1 code implementation • 26 Nov 2023 • Zhanghao Chen, Yifei Sun, Wenjian Qin, Ruiquan Ge, Cheng Pan, Wenming Deng, Zhou Liu, Wenwen Min, Ahmed Elazab, Xiang Wan, Changmiao Wang
As a remedial measure, bone suppression techniques have been introduced.
no code implementations • 17 Sep 2023 • Weifeng Yang, Wenwen Min
Multilinear logistic regression serves as a powerful tool for the analysis of multidimensional data.
1 code implementation • 23 Aug 2023 • Weifeng Yang, Wenwen Min
We propose an accelerated block proximal linear framework with adaptive momentum (ABPL$^+$) for nonconvex and nonsmooth optimization.
1 code implementation • 13 Aug 2023 • Wenwen Min, Taosheng Xu, Chris Ding
However, sPLS extracts linear combinations of variables across two data sets using all samples, and therefore cannot detect latent subsets of samples.
no code implementations • 14 Mar 2023 • Jun Wan, Jun Liu, Jie Zhou, Zhihui Lai, Linlin Shen, Hang Sun, Ping Xiong, Wenwen Min
Most facial landmark detection methods predict landmarks by mapping the input facial appearance features to landmark heatmaps and have achieved promising results.
1 code implementation • 27 Apr 2021 • Wenwen Min, Taosheng Xu, Xiang Wan, Tsung-Hui Chang
Non-negative matrix factorization (NMF) is a powerful tool for dimensionality reduction and clustering.
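As a rough illustration of what NMF computes, the following sketch factors a nonnegative matrix via the classic Lee–Seung multiplicative updates for the Frobenius loss; it is a minimal generic example, not the regularized variant proposed in the paper, and the function name and hyperparameters are illustrative.

```python
import numpy as np

def nmf(X, rank, n_iter=200, eps=1e-10, seed=0):
    """Approximate a nonnegative matrix X as W @ H (W, H >= 0)
    using Lee-Seung multiplicative updates for the Frobenius loss."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # Multiplicative updates keep W and H nonnegative by construction.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factor a small random nonnegative matrix.
X = np.abs(np.random.default_rng(1).random((6, 5)))
W, H = nmf(X, rank=3)
```

The multiplicative form of the updates is what preserves nonnegativity without any explicit projection step, which is why it remains the standard baseline solver for NMF.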
no code implementations • 28 Jul 2018 • Wenwen Min, Juan Liu, Shihua Zhang
We employ an alternating direction method of multipliers (ADMM) to solve the proximal operator.
no code implementations • 13 Oct 2017 • Wenwen Min, Juan Liu, Shihua Zhang
Given two data matrices $X$ and $Y$, sparse canonical correlation analysis (SCCA) seeks two sparse canonical vectors $u$ and $v$ that maximize the correlation between $Xu$ and $Yv$.
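A minimal sketch of this objective, assuming a simple alternating scheme with $\ell_1$ soft-thresholding (in the spirit of penalized matrix decomposition approaches, not necessarily the algorithm used in the paper; function names and penalty levels are illustrative):

```python
import numpy as np

def soft_threshold(a, lam):
    """Elementwise l1 shrinkage operator."""
    return np.sign(a) * np.maximum(np.abs(a) - lam, 0.0)

def scca(X, Y, lam_u=0.1, lam_v=0.1, n_iter=100, seed=0):
    """Alternately update u and v to increase corr(Xu, Yv),
    applying soft-thresholding for sparsity and renormalizing."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(Y.shape[1])
    v /= np.linalg.norm(v)
    u = np.zeros(X.shape[1])
    for _ in range(n_iter):
        u = soft_threshold(X.T @ (Y @ v), lam_u)
        if np.linalg.norm(u) > 0:
            u /= np.linalg.norm(u)
        v = soft_threshold(Y.T @ (X @ u), lam_v)
        if np.linalg.norm(v) > 0:
            v /= np.linalg.norm(v)
    return u, v

# Usage on random data (for shape illustration only).
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 8))
Y = rng.standard_normal((50, 6))
u, v = scca(X, Y)
```

With larger penalty levels, more entries of $u$ and $v$ are driven exactly to zero, which is the mechanism SCCA uses to select a small subset of variables from each data matrix.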
no code implementations • 21 Sep 2016 • Wenwen Min, Juan Liu, Shihua Zhang
To address this, we introduce a novel network-regularized sparse LR model with a new penalty $\lambda \|\bm{w}\|_1 + \eta|\bm{w}|^T\bm{M}|\bm{w}|$ to account for the difference between the absolute values of the coefficients.
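The stated penalty can be evaluated directly from its formula. The sketch below is a literal transcription of $\lambda \|\bm{w}\|_1 + \eta|\bm{w}|^T\bm{M}|\bm{w}|$; the function name is illustrative, and the choice of $\bm{M}$ (in practice a network-derived matrix such as a normalized Laplacian) is an assumption here.

```python
import numpy as np

def network_penalty(w, M, lam, eta):
    """Evaluate lam * ||w||_1 + eta * |w|^T M |w|.

    Taking absolute values of w before the quadratic form makes the
    penalty depend only on coefficient magnitudes, not their signs.
    """
    a = np.abs(w)
    return lam * a.sum() + eta * a @ M @ a

# Usage: with M = I the quadratic term reduces to eta * ||w||_2^2.
w = np.array([1.0, -2.0, 3.0])
val = network_penalty(w, np.eye(3), lam=0.5, eta=0.1)  # 0.5*6 + 0.1*14 = 4.4
```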
no code implementations • 19 Mar 2016 • Wenwen Min, Juan Liu, Shihua Zhang
Motivated by developments in sparse coding and graph-regularized norms, we propose a novel sparse graph-regularized SVD as a powerful biclustering tool for analyzing high-dimensional data.