Dimensionality Reduction

726 papers with code • 0 benchmarks • 10 datasets

Dimensionality reduction is the task of mapping a dataset from a high-dimensional space to a lower-dimensional representation while preserving as much of its relevant structure (e.g., variance, pairwise distances, or local neighborhoods) as possible, typically for visualization, compression, or downstream learning.

(Image credit: openTSNE)
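
As a concrete illustration of the task, the sketch below projects the 64-dimensional scikit-learn digits dataset down to 2 dimensions with PCA and t-SNE. It is only a minimal example of dimensionality reduction in general, not an implementation of any paper listed here, and it assumes scikit-learn is installed.

```python
# Minimal dimensionality reduction sketch with scikit-learn (assumed available):
# project the 64-dimensional digits dataset down to 2 dimensions.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

X, y = load_digits(return_X_y=True)   # X has shape (1797, 64)

# Linear reduction: keep the 2 directions of maximum variance
X_pca = PCA(n_components=2).fit_transform(X)

# Nonlinear reduction: preserve local neighborhood structure
X_tsne = TSNE(n_components=2, init="pca", random_state=0).fit_transform(X)

print(X_pca.shape, X_tsne.shape)      # (1797, 2) (1797, 2)
```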

Most implemented papers

Reservoir computing approaches for representation and classification of multivariate time series

FilippoMB/Reservoir-model-space-classifier 21 Mar 2018

The proposed architectures are compared to other multivariate time series (MTS) classifiers, including deep learning models and time-series kernels.

Unsupervised Metric Learning in Presence of Missing Data

rsonthal/MRMissing.jl 19 Jul 2018

Here, we present a new algorithm, MR-MISSING, that extends these previous algorithms and can be used to compute low-dimensional representations of data sets with missing entries.

CatBoost: gradient boosting with categorical features support

catboost/catboost 24 Oct 2018

In this paper we present CatBoost, a new open-sourced gradient boosting library that successfully handles categorical features and outperforms existing publicly available implementations of gradient boosting in terms of quality on a set of popular publicly available datasets.
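
A minimal usage sketch of the open-source catboost package described in the paper is given below; the toy DataFrame, column names, and hyperparameters are invented for illustration.

```python
# CatBoost sketch: categorical columns are passed as-is via cat_features
# instead of being one-hot encoded by hand. Toy data invented for illustration.
import pandas as pd
from catboost import CatBoostClassifier

train = pd.DataFrame({
    "color": ["red", "blue", "blue", "green", "red", "green"],   # categorical
    "size": [1.0, 2.5, 3.1, 0.7, 1.9, 2.2],                      # numerical
    "label": [0, 1, 1, 0, 0, 1],
})

model = CatBoostClassifier(iterations=200, depth=4, verbose=False)
model.fit(train[["color", "size"]], train["label"], cat_features=["color"])

print(model.predict(pd.DataFrame({"color": ["blue"], "size": [2.0]})))
```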

catch22: CAnonical Time-series CHaracteristics

chlubba/catch22 29 Jan 2019

Capturing the dynamical properties of time series concisely as interpretable feature vectors can enable efficient clustering and classification for time-series applications across science and industry.
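
A short sketch of computing the catch22 feature vector for a single time series, assuming the pycatch22 Python bindings are installed (the series and its construction are invented for illustration):

```python
# Summarize a univariate time series by the 22 catch22 features,
# assuming the pycatch22 bindings (pip install pycatch22).
import math
import pycatch22

# Toy series: a sine wave with a small deterministic perturbation
series = [math.sin(0.1 * t) + 0.05 * ((t * 37) % 11 - 5) for t in range(500)]

features = pycatch22.catch22_all(series)
for name, value in zip(features["names"], features["values"]):
    print(f"{name}: {value:.4f}")
```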

Semantic Relatedness Based Re-ranker for Text Spotting

ahmedssabir/Semantic-Relatedness-Based-Reranker-for-Text-Spotting IJCNLP 2019

We present a scenario where semantic similarity is not enough, and we devise a neural approach to learn semantic relatedness.

Extracting the main trend in a dataset: the Sequencer algorithm

dalya/Sequencer 24 Jun 2020

However, some trends are challenging to detect because they may be expressed in complex ways.

Improving the HardNet Descriptor

pultarmi/HardNet_MultiDataset 19 Jul 2020

In the thesis we consider the problem of local feature descriptor learning for wide-baseline stereo, focusing on the HardNet descriptor, which is close to the state of the art.

M-ar-K-Fast Independent Component Analysis

luca-parisi/m-arcsinh_scikit-learn_TensorFlow_Keras 17 Aug 2021

This study presents the m-arcsinh Kernel ('m-ar-K') Fast Independent Component Analysis ('FastICA') method ('m-ar-K-FastICA') for feature extraction.
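
A baseline sketch using scikit-learn's standard FastICA for feature extraction is shown below; the paper's contribution, replacing the usual contrast function with the m-arcsinh kernel, is not reproduced here, and the toy mixed signals are invented for illustration.

```python
# Baseline FastICA feature extraction with scikit-learn; the paper swaps the
# standard log-cosh contrast for an m-arcsinh kernel (not reproduced here).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
# Toy data: two sources mixed into four observed channels, plus noise
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]
mixing = rng.normal(size=(2, 4))
X = sources @ mixing + 0.05 * rng.normal(size=(2000, 4))

ica = FastICA(n_components=2, fun="logcosh", random_state=0)
X_ica = ica.fit_transform(X)   # extracted independent components / features
print(X_ica.shape)             # (2000, 2)
```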

Wassmap: Wasserstein Isometric Mapping for Image Manifold Learning

keatonhamm/wassmap 13 Apr 2022

In this paper, we propose Wasserstein Isometric Mapping (Wassmap), a nonlinear dimensionality reduction technique that provides solutions to some drawbacks in existing global nonlinear dimensionality reduction algorithms in imaging applications.
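
The core idea, embedding images according to their pairwise Wasserstein distances instead of Euclidean pixel distances, can be sketched as follows. This is an illustrative approximation rather than the authors' implementation; it assumes the POT (ot) package for optimal transport, uses scikit-learn's metric MDS as the embedding step, and works on toy random "images".

```python
# Wassmap-style sketch: pairwise 2-Wasserstein distances between images,
# followed by a metric MDS embedding. Assumes the POT package (pip install pot).
import numpy as np
import ot
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
imgs = [rng.random((8, 8)) for _ in range(5)]        # toy "images" (invented)

# Ground cost: squared Euclidean distance between pixel grid coordinates
xs, ys = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
coords = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
cost = ot.dist(coords, coords, metric="sqeuclidean")

def w2(img_a, img_b):
    a = img_a.ravel() / img_a.sum()                  # normalize to a distribution
    b = img_b.ravel() / img_b.sum()
    return np.sqrt(ot.emd2(a, b, cost))              # exact optimal transport cost

n = len(imgs)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = w2(imgs[i], imgs[j])

# Embed the precomputed distance matrix into 2D with metric MDS
embedding = MDS(n_components=2, dissimilarity="precomputed",
                random_state=0).fit_transform(D)
print(embedding.shape)                               # (5, 2)
```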

FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting

tianzhou2011/FiLM 18 May 2022

Recent studies have shown that deep learning models such as RNNs and Transformers have brought significant performance gains for long-term forecasting of time series because they effectively utilize historical information.