no code implementations • 26 Jun 2018 • Sahar Qaadan, Tobias Glasmachers
Budgeted Stochastic Gradient Descent (BSGD) is a state-of-the-art technique for training large-scale kernelized support vector machines.
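As an illustration of the general idea (not the specific variant studied in the paper), the following is a minimal sketch of budgeted kernel SGD in the style of kernel Pegasos: the model is a set of support vectors with coefficients, and whenever the set exceeds the budget, the support vector with the smallest coefficient magnitude is removed as a simple budget-maintenance heuristic. All names, parameters, and the removal strategy here are illustrative assumptions.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian (RBF) kernel between two points
    return np.exp(-gamma * np.sum((x - y) ** 2))

def budgeted_kernel_sgd(X, y, budget=50, lam=0.01, epochs=5, gamma=1.0, seed=0):
    """Kernel Pegasos-style SGD with a hard budget on support vectors.

    Budget maintenance here is a simple heuristic: when the budget
    overflows, drop the support vector with the smallest |alpha|.
    This is an illustrative sketch, not the paper's algorithm.
    """
    rng = np.random.default_rng(seed)
    sv, alpha = [], []  # support vectors and their coefficients
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)
            # decision value f(x_i) under the current model
            f = sum(a * rbf(s, X[i], gamma) for s, a in zip(sv, alpha))
            # regularization shrinks all coefficients
            alpha = [(1 - eta * lam) * a for a in alpha]
            if y[i] * f < 1.0:  # margin violation: add a new support vector
                sv.append(X[i])
                alpha.append(eta * y[i])
                if len(sv) > budget:  # enforce the budget constraint
                    j = int(np.argmin(np.abs(alpha)))
                    sv.pop(j)
                    alpha.pop(j)
    return sv, alpha
```

In this sketch the model size never exceeds `budget`, so prediction cost stays bounded regardless of how many training points are processed, which is the central point of budgeted SVM training.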
no code implementations • 26 Jun 2018 • Sahar Qaadan, Merlin Schüler, Tobias Glasmachers
We present a dual subspace ascent algorithm for support vector machine training that respects a budget constraint limiting the number of support vectors.
no code implementations • 26 Jun 2018 • Tobias Glasmachers, Sahar Qaadan
Limiting the model size of a kernel support vector machine to a pre-defined budget is a well-established technique that makes it possible to scale SVM learning and prediction to large-scale data.