no code implementations • 22 Feb 2024 • Stephen Pasteris, Alberto Rumi, Maximilian Thiessen, Shota Saito, Atsushi Miyauchi, Fabio Vitale, Mark Herbster
We study the classic problem of prediction with expert advice under bandit feedback.
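In the bandit-feedback version of this problem, the learner observes the loss of only the expert it followed, not the losses of all experts. A standard baseline for this setting (not the algorithm of the paper above) is EXP3, which keeps exponential weights and corrects for partial feedback with an importance-weighted loss estimate. A minimal sketch, where `reward_fn` is a hypothetical stand-in for the environment:

```python
import math
import random

def exp3(n_arms, gamma, reward_fn, horizon):
    """EXP3 for adversarial bandits: exponential weights with
    gamma-uniform exploration. reward_fn(t, arm) must return a
    reward in [0, 1]. Returns the total reward collected."""
    weights = [1.0] * n_arms
    total = 0.0
    for t in range(horizon):
        w_sum = sum(weights)
        # mix the weight distribution with uniform exploration
        probs = [(1 - gamma) * w / w_sum + gamma / n_arms for w in weights]
        arm = random.choices(range(n_arms), weights=probs)[0]
        reward = reward_fn(t, arm)
        total += reward
        # importance-weighted estimate: only the pulled arm is updated
        est = reward / probs[arm]
        weights[arm] *= math.exp(gamma * est / n_arms)
    return total
```

The importance weighting `reward / probs[arm]` is what makes the single observed reward an unbiased estimate of that arm's true reward, which is the key device distinguishing bandit from full-information expert algorithms.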
1 code implementation • 14 Jun 2023 • Shota Saito, Mark Herbster
We prove upper and lower bounds on this approximation and observe that it is exact when the graph is a tree.
no code implementations • 11 Feb 2023 • Stephen Pasteris, Fabio Vitale, Mark Herbster, Claudio Gentile, André Panisson
We investigate the problem of online collaborative filtering under no-repetition constraints, whereby users need to be served content in an online fashion and a given user cannot be recommended the same content item more than once.
no code implementations • NeurIPS 2021 • Mark Herbster, Stephen Pasteris, Fabio Vitale, Massimiliano Pontil
Users are embedded in a social network, and the learner is aided by a priori knowledge of the strengths of the social links between all pairs of users.
no code implementations • NeurIPS 2021 • James Robinson, Mark Herbster
We address the problem of sequential prediction with expert advice in a non-stationary environment with long-term memory guarantees in the sense of Bousquet and Warmuth [4].
no code implementations • NeurIPS 2020 • Mark Herbster, Stephen Pasteris, Lisa Tse
We provide an algorithm that predicts on each trial in time linear in the number of hypotheses when the hypothesis class is finite.
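For intuition on per-trial time linear in the number of hypotheses, the classic Halving algorithm is a useful reference point (it is not the algorithm of the paper above): each trial takes one pass over the active hypotheses to form a majority vote, and at most log2(|H|) mistakes are made when some hypothesis is consistent. A minimal sketch with hypotheses as callables:

```python
def halving_predict(hypotheses, xs, ys):
    """Halving algorithm on a finite hypothesis class of {0,1}-valued
    functions. Each trial costs O(|active|): one vote pass and one
    filter pass. Returns the number of mistakes made."""
    active = list(hypotheses)
    mistakes = 0
    for x, y in zip(xs, ys):
        votes = sum(h(x) for h in active)           # one linear pass
        pred = 1 if 2 * votes > len(active) else 0  # majority vote
        if pred != y:
            mistakes += 1
        active = [h for h in active if h(x) == y]   # keep consistent ones
    return mistakes
```

Every mistake halves (at least) the active set, giving the log2(|H|) mistake bound in the realizable case.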
no code implementations • 6 Jul 2020 • Stephen Pasteris, Ting He, Fabio Vitale, Shiqiang Wang, Mark Herbster
In this paper, we provide a rigorous theoretical investigation of an online learning version of the Facility Location problem which is motivated by emerging problems in real-world applications.
no code implementations • NeurIPS 2020 • Mark Herbster, Stephen Pasteris, Lisa Tse
In this setting, we provide an example where the side information is not directly specified in advance.
no code implementations • 28 Oct 2018 • Stephen Pasteris, Fabio Vitale, Kevin Chan, Shiqiang Wang, Mark Herbster
We introduce a new online learning framework where, at each trial, the learner is required to select a subset of actions from a given known action set.
1 code implementation • NeurIPS 2019 • Mark Herbster, James Robinson
We address the problem of predicting the labeling of a graph in an online setting when the labeling is changing over time.
no code implementations • 26 Jul 2017 • Carlo Ciliberto, Mark Herbster, Alessandro Davide Ialongo, Massimiliano Pontil, Andrea Rocchetto, Simone Severini, Leonard Wossnig
Recently, increased computational power and data availability, as well as algorithmic advances, have enabled machine learning techniques to achieve impressive results in regression, classification, data generation, and reinforcement learning tasks.
no code implementations • 19 Jun 2017 • Stephen Pasteris, Fabio Vitale, Claudio Gentile, Mark Herbster
We measure performance not based on the recovery of the hidden similarity function, but instead on how well we classify each item.
no code implementations • NeurIPS 2016 • Mark Herbster, Stephen Pasteris, Massimiliano Pontil
We study the problem of completing a binary matrix in an online learning setting.
no code implementations • NeurIPS 2015 • Mark Herbster, Stephen Pasteris, Shaona Ghosh
We design an online algorithm to classify the vertices of a graph.
no code implementations • 25 Feb 2015 • Mark Herbster, Paul Rubenstein, James Townsend
Given a set $X$ and a function $h:X\longrightarrow\{0, 1\}$ which labels each element of $X$ with either $0$ or $1$, we may define a function $h^{(s)}$ to measure the similarity of pairs of points in $X$ according to $h$.
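One natural way to induce such a pairwise function is to call a pair similar exactly when $h$ agrees on it; this agreement-based construction is an illustrative assumption here, and the paper's $h^{(s)}$ may be defined differently. A minimal sketch:

```python
def similarity(h):
    """Given a labelling h: X -> {0, 1}, return the induced pairwise
    function that is 1 iff h assigns the two points the same label.
    (Agreement-based similarity is an assumption for illustration.)"""
    return lambda x, y: 1 if h(x) == h(y) else 0

# toy labelling on integers: parity
h = lambda x: x % 2
sim = similarity(h)
```

Under this construction, predicting similarities well is closely tied to predicting the labels themselves, up to a global flip of $h$.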
no code implementations • NeurIPS 2008 • Mark Herbster, Massimiliano Pontil, Sergio R. Galeano
Given an $n$-vertex weighted tree with structural diameter $S$ and a subset of $m$ vertices, we present a technique to compute a corresponding $m \times m$ Gram matrix of the pseudoinverse of the graph Laplacian in $O(n+ m^2 + m S)$ time.
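As a naive reference point (not the paper's $O(n + m^2 + mS)$ technique), the Laplacian pseudoinverse of a tree can be recovered from pairwise effective resistances via the classical identity $L^+ = -\tfrac{1}{2} J R J$ with $J = I - \tfrac{1}{n}\mathbf{1}\mathbf{1}^\top$; on a tree, the effective resistance between two vertices is simply the path sum of inverse edge weights. A minimal $O(n^2)$ sketch:

```python
from collections import defaultdict, deque

def tree_laplacian_pinv(n, edges):
    """Pseudoinverse of the Laplacian of a weighted tree on vertices
    0..n-1, via double-centring the effective-resistance matrix:
    L+[i][j] = -1/2 * (R[i][j] - rowmean_i - rowmean_j + grandmean).
    edges: list of (u, v, weight). Naive O(n^2) reference."""
    adj = defaultdict(list)
    for u, v, w in edges:
        adj[u].append((v, 1.0 / w))   # edge resistance = 1 / weight
        adj[v].append((u, 1.0 / w))
    # all-pairs resistance by BFS from each vertex (unique tree paths)
    R = [[0.0] * n for _ in range(n)]
    for s in range(n):
        dist = {s: 0.0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v, r in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + r
                    q.append(v)
        for v, d in dist.items():
            R[s][v] = d
    row = [sum(r) / n for r in R]
    grand = sum(row) / n
    return [[-0.5 * (R[i][j] - row[i] - row[j] + grand)
             for j in range(n)] for i in range(n)]
```

The paper's contribution is doing substantially better than this quadratic baseline when only an $m \times m$ sub-block of the Gram matrix is needed.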
no code implementations • NeurIPS 2008 • Mark Herbster, Guy Lever, Massimiliano Pontil
Current on-line learning algorithms for predicting the labelling of a graph have an important limitation in the case of large diameter graphs; the number of mistakes made by such algorithms may be proportional to the square root of the number of vertices, even when tackling simple problems.