Dimensionality Reduction
725 papers with code • 0 benchmarks • 10 datasets
Dimensionality reduction is the task of mapping high-dimensional data to a lower-dimensional representation while preserving as much of its meaningful structure as possible.
(Image credit: openTSNE)
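A minimal sketch of the task, using PCA from scikit-learn as one representative reducer (any linear or nonlinear method fills the same role):

```python
# Reduce a 4-dimensional dataset to 2 dimensions with PCA.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)        # 150 samples, 4 features
reducer = PCA(n_components=2)
X_2d = reducer.fit_transform(X)          # 150 samples, 2 features

print(X_2d.shape)                        # (150, 2)
# Fraction of the original variance retained by the 2 components:
print(reducer.explained_variance_ratio_.sum())
```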
Benchmarks
These leaderboards track progress in Dimensionality Reduction.
Libraries
Use these libraries to find Dimensionality Reduction models and implementations
Datasets
Latest papers
Cross-Temporal Spectrogram Autoencoder (CTSAE): Unsupervised Dimensionality Reduction for Clustering Gravitational Wave Glitches
In response to this challenge, we introduce the Cross-Temporal Spectrogram Autoencoder (CTSAE), a pioneering unsupervised method for the dimensionality reduction and clustering of gravitational wave glitches.
Distributional Principal Autoencoders
Dimension reduction techniques usually lose information in the sense that reconstructed data are not identical to the original data.
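The information-loss point can be demonstrated with plain PCA (a stand-in here, not the paper's distributional autoencoder): project the data down, map it back up, and measure the gap between the original and the reconstruction.

```python
# PCA reconstruction is lossy whenever components are discarded.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)
pca = PCA(n_components=2).fit(X)         # keep 2 of 4 dimensions
X_rec = pca.inverse_transform(pca.transform(X))

err = np.mean((X - X_rec) ** 2)          # mean squared reconstruction error
print(err > 0)                           # True: reconstruction != original
```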
Quiver Laplacians and Feature Selection
The challenge of selecting the most relevant features of a given dataset arises ubiquitously in data analysis and dimensionality reduction.
scCDCG: Efficient Deep Structural Clustering for single-cell RNA-seq via Deep Cut-informed Graph Embedding
Addressing these limitations, we introduce scCDCG (single-cell RNA-seq Clustering via Deep Cut-informed Graph), a novel framework designed for efficient and accurate clustering of scRNA-seq data that simultaneously utilizes intercellular high-order structural information.
Remote sensing framework for geological mapping via stacked autoencoders and clustering
In this study, we present an unsupervised machine learning framework for processing remote sensing data by utilizing stacked autoencoders for dimensionality reduction and k-means clustering for mapping geological units.
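A heavily simplified sketch of that pipeline, not the study's implementation: a single-hidden-layer autoencoder (scikit-learn's MLPRegressor trained to reconstruct its input) stands in for the stacked autoencoder, and k-means clusters the learned low-dimensional codes.

```python
# Autoencode to a 2-D bottleneck, then cluster the codes with k-means.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

X, _ = load_iris(return_X_y=True)        # stand-in for remote sensing data
X = MinMaxScaler().fit_transform(X)

# Autoencoding objective: train the network to reconstruct its own input.
ae = MLPRegressor(hidden_layer_sizes=(2,), activation="relu",
                  max_iter=2000, random_state=0)
ae.fit(X, X)

# Recover the 2-D bottleneck codes by replaying the first layer manually.
codes = np.maximum(0, X @ ae.coefs_[0] + ae.intercepts_[0])

# Cluster the codes into map units, as the framework does with k-means.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(codes)
print(codes.shape, len(labels))
```

A real stacked autoencoder would pretrain several encoder layers and fine-tune them jointly; the single bottleneck layer here only illustrates the encode-then-cluster structure.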
DMSSN: Distilled Mixed Spectral-Spatial Network for Hyperspectral Salient Object Detection
To address these challenges, we propose a novel approach termed the Distilled Mixed Spectral-Spatial Network (DMSSN), comprising a Distilled Spectral Encoding process and a Mixed Spectral-Spatial Transformer (MSST) feature extraction network.
Enhancing Dimension-Reduced Scatter Plots with Class and Feature Centroids
We illustrate the utility of this approach with data derived from the phenotypes of three neurogenetic diseases and demonstrate how the addition of class and feature centroids increases the interpretability of scatter plots.
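The class-centroid idea can be sketched in a few lines; PCA and the iris classes below are illustrative stand-ins for the paper's embeddings and phenotype data. Each centroid is simply the mean position of one class's points in the reduced space.

```python
# Compute per-class centroids in a 2-D embedding for scatter-plot annotation.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, y = load_iris(return_X_y=True)
X_2d = PCA(n_components=2).fit_transform(X)

# One centroid per class: the mean embedded position of that class's points.
centroids = np.array([X_2d[y == c].mean(axis=0) for c in np.unique(y)])
print(centroids.shape)                   # (3, 2): 3 classes, 2 dimensions
```

Plotting these centroids on top of the scatter plot (e.g. as labeled markers) gives each class an anchor point for the eye.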
Efficient Algorithms for Regularized Nonnegative Scale-invariant Low-rank Approximation Models
However, from a practical perspective, the choice of regularizers and regularization coefficients, as well as the design of efficient algorithms, is challenging because of the multifactor nature of these models and the lack of theory to back these choices.
Targeted Visualization of the Backbone of Encoder LLMs
Attention-based Large Language Models (LLMs) are the state-of-the-art in natural language processing (NLP).
S+t-SNE - Bringing dimensionality reduction to data streams
We present S+t-SNE, an adaptation of the t-SNE algorithm designed to handle infinite data streams.
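For context, the static baseline that S+t-SNE adapts looks like this with scikit-learn's t-SNE (the streaming machinery of S+t-SNE itself is not shown):

```python
# Batch t-SNE: embeds a fixed dataset into 2-D; S+t-SNE extends this
# setting to data that arrives as an unbounded stream.
from sklearn.datasets import load_iris
from sklearn.manifold import TSNE

X, _ = load_iris(return_X_y=True)
emb = TSNE(n_components=2, perplexity=30, init="pca",
           random_state=0).fit_transform(X)
print(emb.shape)                         # (150, 2)
```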