no code implementations • 1 May 2019 • Colleen M. Farrelly, Srikanth Namuduri, Uchenna Chukwu
Further, by using an algorithm that superposes all possible distributions and collapses them to fit a dataset, we optimize the model in a computationally efficient way.
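The excerpt above is brief, so the following is only a loose, purely classical sketch of the underlying idea of evaluating many candidate distributions against a dataset and keeping the best fit; the candidate list and AIC-based selection are illustrative assumptions, and none of the quantum superposition/collapse machinery is reproduced here.

```python
# Loose classical sketch: fit several candidate distributions to a sample
# and keep the one with the best (lowest-AIC) fit. The distribution list
# and the AIC criterion are illustrative assumptions, not the paper's method.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=1.5, size=500)  # toy dataset

candidates = [stats.norm, stats.gamma, stats.lognorm, stats.expon]

best_name, best_aic = None, np.inf
for dist in candidates:
    params = dist.fit(data)                      # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(data, *params))  # log-likelihood under the fit
    aic = 2 * len(params) - 2 * loglik           # Akaike information criterion
    if aic < best_aic:
        best_name, best_aic = dist.name, aic

print(f"best-fitting distribution: {best_name} (AIC={best_aic:.1f})")
```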
no code implementations • 11 Oct 2017 • Colleen M. Farrelly
Ensemble learning has had many successes in supervised learning, but it has been rare in unsupervised learning and dimensionality reduction.
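As a minimal sketch of what an ensemble approach to dimensionality reduction might look like, the snippet below runs PCA on random feature subsets and aggregates the member embeddings through an averaged pairwise-distance matrix; the subset size, ensemble size, and aggregation rule are assumptions rather than the paper's specific procedure.

```python
# Minimal sketch of ensemble dimensionality reduction: run PCA on random
# feature subsets, average the pairwise distances of the member embeddings,
# then embed the consensus distances with MDS.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from sklearn.metrics import pairwise_distances

X = load_iris().data
rng = np.random.default_rng(0)
n_members, subset_size = 20, 3

avg_dist = np.zeros((X.shape[0], X.shape[0]))
for _ in range(n_members):
    cols = rng.choice(X.shape[1], size=subset_size, replace=False)
    emb = PCA(n_components=2).fit_transform(X[:, cols])
    avg_dist += pairwise_distances(emb)
avg_dist /= n_members

# Final 2-D coordinates from the consensus (averaged) distance matrix.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(avg_dist)
print(coords.shape)
```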
no code implementations • 21 Aug 2017 • Colleen M. Farrelly
Because both methods have strong theoretical convergence properties, this study compares various superlearner and deep learning architectures (machine-learning-based and neural-network-based) on classification problems across several simulated and industrial datasets, assessing both predictive performance and computational efficiency.
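A super learner is, at its core, a cross-validated stacking ensemble whose meta-learner weights the base learners' out-of-fold predictions. The sketch below sets one up with scikit-learn's StackingClassifier and compares it against a small multilayer perceptron; the particular base learners, fold count, and meta-learner are illustrative assumptions.

```python
# Minimal sketch of a super learner as a cross-validated stacking ensemble,
# compared against a small neural network (MLP) on a toy classification task.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

super_learner = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                ("svm", SVC(probability=True, random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    final_estimator=LogisticRegression(max_iter=1000),  # meta-learner
    cv=5,  # out-of-fold base predictions feed the meta-learner
)
mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)

for name, model in [("super learner", super_learner), ("MLP", mlp)]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```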
no code implementations • 17 Aug 2017 • Colleen M. Farrelly
Nonparametric methods, such as random forests or conditional inference trees, may provide better prediction and insight by modeling interaction terms and other nonlinear relationships between predictors and a given outcome.
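As a minimal illustration of why such methods can help, the sketch below simulates an outcome driven by an interaction and a squared term and compares the cross-validated fit of a linear model and a random forest; conditional inference trees (R's partykit::ctree) have no direct scikit-learn equivalent, so only the random forest is shown, and the simulated data are an assumption.

```python
# Minimal sketch contrasting a linear model with a random forest on data
# containing an interaction term and a nonlinearity.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 3))
# Outcome driven by an interaction (x0*x1) and a nonlinear term (x2**2).
y = X[:, 0] * X[:, 1] + X[:, 2] ** 2 + rng.normal(scale=0.5, size=n)

for name, model in [("linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(n_estimators=300,
                                                            random_state=0))]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {r2:.3f}")
```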
no code implementations • 29 Jul 2017 • Colleen M. Farrelly
Simulation results suggest gains from varying k above and beyond those from bagging features or samples, as well as robustness of KNN ensembles to the curse of dimensionality.
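A minimal sketch of a KNN ensemble along these lines appears below: each member is trained on a bootstrap sample and a random feature subset with its own randomly drawn k, and predictions are combined by majority vote; the ensemble size, k range, and subset size are assumptions.

```python
# Minimal sketch of a KNN ensemble that varies k per member in addition to
# bagging samples and features; binary predictions are combined by majority vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=800, n_features=30, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
n_members, preds = 25, []
for _ in range(n_members):
    rows = rng.choice(len(X_tr), size=len(X_tr), replace=True)  # bag samples
    cols = rng.choice(X.shape[1], size=10, replace=False)       # bag features
    k = int(rng.integers(3, 16))                                 # vary k
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr[np.ix_(rows, cols)],
                                                  y_tr[rows])
    preds.append(knn.predict(X_te[:, cols]))

vote = (np.vstack(preds).mean(axis=0) > 0.5).astype(int)  # majority vote (0/1 labels)
print("ensemble accuracy:", (vote == y_te).mean())
```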