1 code implementation • 18 Oct 2023 • Pascal Kündig, Fabio Sigrist
Latent Gaussian process (GP) models are flexible probabilistic non-parametric function models.
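As a minimal sketch of what fitting such a latent GP model can look like, the snippet below uses the gpboost Python package and its GPModel interface (the gp_coords, cov_function, and likelihood arguments); the simulated data and all parameter values are purely illustrative, and the exact API details are an assumption rather than a statement about this particular paper's implementation.

```python
import numpy as np
import gpboost as gpb

# Simulate binary spatial data: a latent GP plus linear fixed effects,
# observed through a Bernoulli (logit) likelihood. Purely illustrative.
rng = np.random.default_rng(0)
n = 500
coords = rng.uniform(size=(n, 2))                        # spatial locations
X = np.column_stack([np.ones(n), rng.normal(size=n)])    # intercept + one covariate
D = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
Sigma = np.exp(-D / 0.1)                                 # exponential covariance
b = rng.multivariate_normal(np.zeros(n), Sigma)          # latent GP sample
p = 1.0 / (1.0 + np.exp(-(X @ np.array([0.5, 1.0]) + b)))
y = rng.binomial(1, p)

# Latent GP model with a non-Gaussian likelihood
gp_model = gpb.GPModel(gp_coords=coords, cov_function="exponential",
                       likelihood="bernoulli_logit")
gp_model.fit(y=y, X=X)
gp_model.summary()   # estimated fixed effects and covariance parameters
```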
1 code implementation • 5 Jul 2023 • Fabio Sigrist
High-cardinality categorical variables are variables whose number of distinct levels is large relative to the sample size of a data set; in other words, there are few data points per level.
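One common way to handle such a variable is to model it as a grouped random effect instead of one-hot encoding it. The sketch below assumes the gpboost package's GPModel with its group_data argument; the simulated data and numbers are illustrative only.

```python
import numpy as np
import gpboost as gpb

# Simulate data with a high-cardinality categorical variable:
# 1000 observations and 200 levels, i.e. roughly 5 data points per level.
rng = np.random.default_rng(1)
n, n_levels = 1000, 200
group = rng.integers(0, n_levels, size=n)                 # level IDs of the categorical variable
X = np.column_stack([np.ones(n), rng.normal(size=n)])
level_effects = rng.normal(scale=0.5, size=n_levels)
y = X @ np.array([1.0, 2.0]) + level_effects[group] + rng.normal(scale=0.1, size=n)

# Treat the categorical variable as a grouped (intercept) random effect
gp_model = gpb.GPModel(group_data=group, likelihood="gaussian")
gp_model.fit(y=y, X=X)
print(gp_model.get_cov_pars())   # random-effect variance and error variance
```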
1 code implementation • 19 May 2021 • Fabio Sigrist
Latent Gaussian models and boosting are widely used techniques in statistics and machine learning.
no code implementations • 6 Jan 2021 • Jakob A. Dambon, Fabio Sigrist, Reinhard Furrer
It relies on penalized maximum likelihood estimation (PMLE) and allows variable selection with respect to both fixed effects and Gaussian process random effects.
Variable Selection • Methodology
1 code implementation • 6 Apr 2020 • Fabio Sigrist
We introduce a novel way to combine boosting with Gaussian process and mixed effects models.
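A hedged sketch of this kind of combination is shown below: tree boosting models the fixed-effects function while a grouped random effect captures the group structure. It assumes the gpboost package, whose train function accepts a GPModel via its gp_model argument; hyperparameter values are illustrative, not recommendations.

```python
import numpy as np
import gpboost as gpb

# Nonlinear fixed-effects function plus grouped random effects. Illustrative simulation.
rng = np.random.default_rng(2)
n, n_groups = 2000, 100
group = rng.integers(0, n_groups, size=n)
X = rng.uniform(size=(n, 2))
f = np.sin(4 * X[:, 0]) + X[:, 1] ** 2
b = rng.normal(scale=0.5, size=n_groups)
y = f + b[group] + rng.normal(scale=0.1, size=n)

# Combine tree boosting (for f) with a random-effects model (for the groups)
gp_model = gpb.GPModel(group_data=group, likelihood="gaussian")
data_train = gpb.Dataset(X, y)
params = {"objective": "regression_l2", "learning_rate": 0.05,
          "max_depth": 3, "verbose": 0}
bst = gpb.train(params=params, train_set=data_train,
                gp_model=gp_model, num_boost_round=100)
print(gp_model.get_cov_pars())   # covariance parameters estimated alongside the trees
```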
1 code implementation • 11 Feb 2019 • Fabio Sigrist
We introduce a novel boosting algorithm called 'KTBoost', which combines kernel boosting and tree boosting.
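The following self-contained sketch illustrates the combined kernel/tree boosting idea with scikit-learn base learners: at each iteration both a regression tree and a kernel ridge regressor are fit to the residuals, and whichever reduces the training loss more is added to the ensemble. This is an illustration of the concept only, not the KTBoost package's implementation, and all learner settings are arbitrary.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.kernel_ridge import KernelRidge

def kt_boost(X, y, n_iter=50, lr=0.1):
    """Greedy kernel-or-tree boosting for squared loss (simplified sketch)."""
    init = y.mean()
    f = np.full(len(y), init)                 # current ensemble prediction
    learners = []
    for _ in range(n_iter):
        resid = y - f                         # negative gradient of squared loss
        candidates = [DecisionTreeRegressor(max_depth=3),
                      KernelRidge(kernel="rbf", alpha=1.0, gamma=1.0)]
        best, best_loss = None, np.inf
        for learner in candidates:            # fit both base learners to the residuals
            learner.fit(X, resid)
            loss = np.mean((y - (f + lr * learner.predict(X))) ** 2)
            if loss < best_loss:
                best, best_loss = learner, loss
        f = f + lr * best.predict(X)          # keep the better of tree and kernel learner
        learners.append(best)
    return init, learners

def kt_predict(init, learners, X, lr=0.1):
    f = np.full(X.shape[0], init)
    for learner in learners:
        f = f + lr * learner.predict(X)
    return f

# Usage on a toy function with both a smooth part and a jump
rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(300, 1))
y = np.sin(3 * X[:, 0]) + (X[:, 0] > 0) + rng.normal(scale=0.1, size=300)
init, learners = kt_boost(X, y)
print(np.mean((y - kt_predict(init, learners, X)) ** 2))   # training MSE
```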
2 code implementations • 9 Aug 2018 • Fabio Sigrist
In addition, we introduce a novel, interpretable tuning parameter for tree-based Newton boosting that is important for predictive accuracy.
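For context, here is a minimal sketch of the tree-based Newton boosting step this entry refers to (the paper's novel tuning parameter itself is not reproduced here): each tree is fit to the per-sample Newton step -g/h using the Hessians h as sample weights, shown with scikit-learn and the logistic loss.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def newton_boost_logistic(X, y, n_iter=100, lr=0.1, max_depth=3):
    """Tree-based Newton boosting for binary classification with the logistic loss."""
    F = np.zeros(len(y))                        # raw scores (log-odds)
    trees = []
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-F))            # current predicted probabilities
        g = p - y                               # gradient of the logistic loss
        h = p * (1.0 - p)                       # Hessian of the logistic loss
        tree = DecisionTreeRegressor(max_depth=max_depth)
        # Weighted fit to -g/h gives leaf values -sum(g)/sum(h), i.e. Newton steps per leaf
        tree.fit(X, -g / np.maximum(h, 1e-12), sample_weight=h)
        F = F + lr * tree.predict(X)
        trees.append(tree)
    return trees

# Usage: binary labels from a noisy nonlinear decision boundary
rng = np.random.default_rng(4)
X = rng.normal(size=(500, 2))
y = (X[:, 0] ** 2 + X[:, 1] > 0.5).astype(float)
trees = newton_boost_logistic(X, y)
```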
2 code implementations • 23 Nov 2017 • Fabio Sigrist, Christoph Hirnschall
A frequent problem in binary classification is class imbalance between a minority and a majority class, such as defaults versus non-defaults in default prediction.
Methodology