no code implementations • 14 Apr 2022 • Alex Glushkovsky
The article discusses a five-step approach: (1) segmentation of the input features and of the underlying variables of the metric, supported by unsupervised autoencoders; (2) univariate or joint fitting of the metric by the aggregated input features on the segmented domains; (3) transformation of the pre-screened input features according to the fitted models; (4) aggregation of the transformed features as time series; and (5) modelling of the metric time series as a sum of constrained linear effects of the aggregated features.
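Step (5) above, fitting the metric as a sum of constrained linear effects, can be sketched with a non-negativity-bounded least-squares fit. The feature names, data, and the specific non-negativity constraint are illustrative assumptions, not details taken from the article:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Hypothetical aggregated, transformed feature time series (outputs of steps 3-4).
rng = np.random.default_rng(0)
T = 120                                 # number of time points (illustrative)
X = rng.random((T, 3))                  # three aggregated feature series
true_w = np.array([0.5, 1.2, 0.0])      # ground-truth effects for the demo
y = X @ true_w + 0.01 * rng.standard_normal(T)   # synthetic metric time series

# Step (5): model the metric as a sum of constrained linear effects;
# here the constraint is that each effect must be non-negative.
res = lsq_linear(X, y, bounds=(0, np.inf))
print(res.x)        # fitted effects, close to true_w
```

The bounded solver keeps every fitted effect inside the stated constraint, which is the point of constraining the linear effects rather than using ordinary least squares.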
no code implementations • 29 Sep 2021 • Alex Glushkovsky
A beta variational autoencoder (beta-VAE) has been applied to represent trials of the initial full factorial design, after filtering out infeasible trials, on a low-dimensional latent space.
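A beta-VAE differs from a plain VAE by weighting the KL term of the objective with a factor beta. A minimal numerical sketch of that objective, with illustrative values (the function name and the beta=4.0 default are assumptions for the demo, not from the paper):

```python
import numpy as np

# Sketch of the beta-VAE objective:
#   loss = reconstruction error + beta * KL(q(z|x) || N(0, I))
# The encoder is assumed to output a Gaussian (mu, log_var) per latent dimension.
def beta_vae_loss(x, x_recon, mu, log_var, beta=4.0):
    recon = np.mean((x - x_recon) ** 2)                        # reconstruction term
    kl = -0.5 * np.sum(1 + log_var - mu**2 - np.exp(log_var))  # KL to N(0, I)
    return recon + beta * kl

x = np.array([0.2, 0.8])
x_recon = np.array([0.25, 0.75])
mu = np.zeros(2)          # posterior equal to the prior ...
log_var = np.zeros(2)     # ... so the KL term vanishes
print(beta_vae_loss(x, x_recon, mu, log_var))  # → 0.0025 (pure reconstruction error)
```

Raising beta above 1 penalizes posteriors that drift from the prior more heavily, which encourages the disentangled latent representations the beta-VAE is used for here.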
no code implementations • 24 Nov 2020 • Alex Glushkovsky
Applying this unsupervised learning to the transposed data of electron configurations, the ordering of the input variables arranged by the encoder on the latent space has turned out to match exactly the sequence of Madelung's rule.
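The Madelung sequence that the encoder reportedly recovered can be generated directly: subshells are ordered by increasing n + l, with ties broken by lower n. A short sketch (the range n ≤ 5 is an arbitrary cutoff for the demo):

```python
# Madelung's rule: order subshells (n, l) by n + l, breaking ties by lower n.
subshells = [(n, l) for n in range(1, 6) for l in range(n)]
order = sorted(subshells, key=lambda nl: (nl[0] + nl[1], nl[0]))
labels = [f"{n}{'spdfg'[l]}" for n, l in order]
print(labels)
# → ['1s', '2s', '2p', '3s', '3p', '4s', '3d', '4p', '5s', '4d', '5p', '4f', '5d', '5f', '5g']
```

Note that 4s (n + l = 4) precedes 3d (n + l = 5), reproducing the familiar Aufbau filling order that the latent-space arrangement is said to match.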
no code implementations • 6 Apr 2020 • Alex Glushkovsky
The latent space representation has been performed using an unsupervised beta variational autoencoder (beta-VAE).
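The latent-space representation of an input in a VAE is typically obtained by sampling from the encoder's Gaussian via the reparameterization trick, z = mu + sigma * eps. A minimal sketch with illustrative encoder outputs (the 2-D latent size and all values are assumptions, not from the paper):

```python
import numpy as np

# Reparameterization trick: sample a latent point z from the encoder's
# Gaussian posterior, z = mu + exp(0.5 * log_var) * eps, eps ~ N(0, I).
rng = np.random.default_rng(42)
mu = np.array([0.3, -1.1])          # illustrative encoder mean (2-D latent)
log_var = np.array([-2.0, -2.0])    # illustrative encoder log-variance
eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * log_var) * eps   # latent representation of the input
print(z.shape)  # (2,)
```

Sampling through this deterministic transform of mu and log_var is what keeps the encoder differentiable during training while still producing a stochastic low-dimensional representation.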