Dimensionality Reduction
726 papers with code • 0 benchmarks • 10 datasets
Dimensionality reduction is the task of projecting a dataset onto a lower-dimensional space while preserving as much of its essential structure as possible.
(Image credit: openTSNE)
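As a minimal sketch of the task, here is dimensionality reduction with PCA from scikit-learn (PCA is just one standard technique; the papers below cover many others, such as t-SNE, ICA, and Isomap):

```python
# Reduce a 50-dimensional dataset to 2 dimensions with PCA.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))   # 100 samples, 50 features

pca = PCA(n_components=2)        # keep the 2 directions of highest variance
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)  # (100, 2)
```

The same `fit_transform` pattern applies to most scikit-learn reducers, so swapping in another method is usually a one-line change.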
Benchmarks
These leaderboards are used to track progress in Dimensionality Reduction.
Libraries
Use these libraries to find Dimensionality Reduction models and implementations.
Datasets
Most implemented papers
Reservoir computing approaches for representation and classification of multivariate time series
The architectures are compared to other MTS classifiers, including deep learning models and time series kernels.
Unsupervised Metric Learning in Presence of Missing Data
Here, we present a new algorithm, MR-MISSING, that extends these previous algorithms and can compute low-dimensional representations of datasets with missing entries.
CatBoost: gradient boosting with categorical features support
In this paper we present CatBoost, a new open-sourced gradient boosting library that successfully handles categorical features and outperforms existing publicly available gradient boosting implementations in quality on a set of popular public datasets.
catch22: CAnonical Time-series CHaracteristics
Capturing the dynamical properties of time series concisely as interpretable feature vectors can enable efficient clustering and classification for time-series applications across science and industry.
Semantic Relatedness Based Re-ranker for Text Spotting
We present a scenario where semantic similarity is not enough, and we devise a neural approach to learn semantic relatedness.
Extracting the main trend in a dataset: the Sequencer algorithm
However, some trends are challenging to detect, as they may be expressed in complex ways.
Improving the HardNet Descriptor
In this thesis, we consider the problem of local feature descriptor learning for wide-baseline stereo, focusing on the HardNet descriptor, which is close to state-of-the-art.
M-ar-K-Fast Independent Component Analysis
This study presents the m-arcsinh Kernel ('m-ar-K') Fast Independent Component Analysis ('FastICA') method ('m-ar-K-FastICA') for feature extraction.
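The paper's method modifies the kernel used inside FastICA; the sketch below shows only the standard FastICA baseline it builds on, via scikit-learn (an assumption for illustration, not the paper's m-ar-K implementation):

```python
# Blind source separation with standard FastICA: recover independent
# non-Gaussian sources from linear mixtures.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
S = rng.laplace(size=(200, 3))   # 3 independent non-Gaussian sources
A = rng.normal(size=(3, 3))      # mixing matrix
X = S @ A.T                      # observed mixed signals

ica = FastICA(n_components=3, random_state=0)
S_est = ica.fit_transform(X)     # estimates of the unmixed sources

print(S_est.shape)  # (200, 3)
```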
Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning
In this paper, we propose Wasserstein Isometric Mapping (Wassmap), a nonlinear dimensionality reduction technique that provides solutions to some drawbacks in existing global nonlinear dimensionality reduction algorithms in imaging applications.
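For context, Wassmap generalizes classical Isomap by replacing Euclidean distances with Wasserstein distances between images. The sketch below runs the standard Isomap baseline on a synthetic manifold using scikit-learn (this is the baseline method, not the paper's Wassmap code):

```python
# Classical Isomap: embed a 3-D "swiss roll" manifold into 2 dimensions
# using geodesic (graph shortest-path) distances.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=300, random_state=0)  # 3-D manifold data
emb = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

print(emb.shape)  # (300, 2)
```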
FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting
Recent studies have shown that deep learning models such as RNNs and Transformers have brought significant performance gains for long-term forecasting of time series because they effectively utilize historical information.