Search Results for author: Mark Herbster

Found 17 papers, 2 papers with code

Multi-class Graph Clustering via Approximated Effective $p$-Resistance

1 code implementation 14 Jun 2023 Shota Saito, Mark Herbster

We prove upper and lower bounds on this approximation and observe that it is exact when the graph is a tree.

Clustering, Graph Clustering
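The exactness-on-trees observation is easy to check in the classical $p=2$ case, where effective resistance is computed from the Laplacian pseudoinverse. The sketch below is illustrative only (it does not implement the paper's $p$-resistance approximation): on a path graph, a tree, the effective resistance between two vertices equals the path length between them.

```python
import numpy as np

def effective_resistance(L_pinv, i, j):
    # Classical (p = 2) effective resistance from the Laplacian
    # pseudoinverse: r(i, j) = L+_ii + L+_jj - 2 L+_ij
    return L_pinv[i, i] + L_pinv[j, j] - 2 * L_pinv[i, j]

# Unweighted path graph on 5 vertices (a tree): 0-1-2-3-4
n = 5
A = np.zeros((n, n))
for u in range(n - 1):
    A[u, u + 1] = A[u + 1, u] = 1.0
L = np.diag(A.sum(axis=1)) - A   # graph Laplacian
L_pinv = np.linalg.pinv(L)       # Moore-Penrose pseudoinverse

# On a tree the effective resistance equals the path length,
# here the hop count between the endpoints: r(0, 4) = 4.
r = effective_resistance(L_pinv, 0, 4)
```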

Adversarial Online Collaborative Filtering

no code implementations 11 Feb 2023 Stephen Pasteris, Fabio Vitale, Mark Herbster, Claudio Gentile, Andre' Panisson

We investigate the problem of online collaborative filtering under no-repetition constraints, whereby users need to be served content in an online fashion and a given user cannot be recommended the same content item more than once.

Collaborative Filtering
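The no-repetition constraint described above can be illustrated with a minimal serving loop. This is only a sketch of the constraint, not the paper's algorithm; the function name and the score dictionary are hypothetical.

```python
def recommend(user_history, scores):
    """Serve the highest-scoring item the user has not seen yet.

    Illustrates the no-repetition constraint only: once an item has
    been recommended to a user, it is excluded from future trials.
    """
    candidates = {item: s for item, s in scores.items()
                  if item not in user_history}
    if not candidates:
        return None  # every item has already been served to this user
    return max(candidates, key=candidates.get)

# A user who has already been shown item 1 next receives item 3,
# the best-scoring item among those not yet served.
choice = recommend({1}, {1: 0.9, 2: 0.5, 3: 0.7})
```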

A Gang of Adversarial Bandits

no code implementations NeurIPS 2021 Mark Herbster, Stephen Pasteris, Fabio Vitale, Massimiliano Pontil

Users are in a social network and the learner is aided by a priori knowledge of the strengths of the social links between all pairs of users.

Recommendation Systems

Improved Regret Bounds for Tracking Experts with Memory

no code implementations NeurIPS 2021 James Robinson, Mark Herbster

We address the problem of sequential prediction with expert advice in a non-stationary environment with long-term memory guarantees in the sense of Bousquet and Warmuth [4].

Portfolio Optimization

Online Multitask Learning with Long-Term Memory

no code implementations NeurIPS 2020 Mark Herbster, Stephen Pasteris, Lisa Tse

We provide an algorithm that predicts on each trial in time linear in the number of hypotheses when the hypothesis class is finite.

Online Learning of Facility Locations

no code implementations 6 Jul 2020 Stephen Pasteris, Ting He, Fabio Vitale, Shiqiang Wang, Mark Herbster

In this paper, we provide a rigorous theoretical investigation of an online learning version of the Facility Location problem, which is motivated by emerging problems in real-world applications.

Online Matrix Completion with Side Information

no code implementations NeurIPS 2020 Mark Herbster, Stephen Pasteris, Lisa Tse

In this setting, we provide an example where the side information is not directly specified in advance.

Matrix Completion

MaxHedge: Maximising a Maximum Online

no code implementations 28 Oct 2018 Stephen Pasteris, Fabio Vitale, Kevin Chan, Shiqiang Wang, Mark Herbster

We introduce a new online learning framework where, at each trial, the learner is required to select a subset of actions from a given known action set.

Online Prediction of Switching Graph Labelings with Cluster Specialists

1 code implementation NeurIPS 2019 Mark Herbster, James Robinson

We address the problem of predicting the labeling of a graph in an online setting when the labeling is changing over time.

Quantum machine learning: a classical perspective

no code implementations 26 Jul 2017 Carlo Ciliberto, Mark Herbster, Alessandro Davide Ialongo, Massimiliano Pontil, Andrea Rocchetto, Simone Severini, Leonard Wossnig

Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning techniques to impressive results in regression, classification, data-generation and reinforcement learning tasks.

BIG-bench Machine Learning, Quantum Machine Learning

On Pairwise Clustering with Side Information

no code implementations 19 Jun 2017 Stephen Pasteris, Fabio Vitale, Claudio Gentile, Mark Herbster

We measure performance not based on the recovery of the hidden similarity function, but instead on how well we classify each item.

Clustering, Inductive Bias

The VC-Dimension of Similarity Hypotheses Spaces

no code implementations 25 Feb 2015 Mark Herbster, Paul Rubenstein, James Townsend

Given a set $X$ and a function $h:X\longrightarrow\{0, 1\}$ which labels each element of $X$ with either $0$ or $1$, we may define a function $h^{(s)}$ to measure the similarity of pairs of points in $X$ according to $h$.

PAC learning
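The snippet above does not spell out how $h^{(s)}$ is defined. A natural construction, assumed here for illustration, is that a pair of points is similar exactly when $h$ assigns them the same label; the function names below are hypothetical.

```python
def h_similarity(h, x, y):
    # Assumed pairwise similarity induced by a binary labeling h:
    # the pair (x, y) is "similar" (1) iff h labels x and y the same.
    return 1 if h(x) == h(y) else 0

# An example labeling h : R -> {0, 1}: sign of the input.
h = lambda x: 1 if x >= 0 else 0

# Points on the same side of 0 are similar; points on opposite
# sides are not.
same = h_similarity(h, 2.0, 3.0)       # 1
different = h_similarity(h, -1.0, 3.0) # 0
```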

Fast Prediction on a Tree

no code implementations NeurIPS 2008 Mark Herbster, Massimiliano Pontil, Sergio R. Galeano

Given an $n$-vertex weighted tree with structural diameter $S$ and a subset of $m$ vertices, we present a technique to compute a corresponding $m \times m$ Gram matrix of the pseudoinverse of the graph Laplacian in $O(n+ m^2 + m S)$ time.
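The object being computed can be made concrete with a naive dense baseline. The sketch below forms the same $m \times m$ Gram matrix by explicitly pseudoinverting the Laplacian in $O(n^3)$ time; it is NOT the paper's $O(n + m^2 + mS)$ technique, and the example tree is made up for illustration.

```python
import numpy as np

def gram_submatrix(A, vertices):
    # Naive O(n^3) baseline, NOT the paper's fast method:
    # pseudoinvert the full Laplacian, then restrict to the
    # m chosen vertices.
    L = np.diag(A.sum(axis=1)) - A   # weighted graph Laplacian
    L_pinv = np.linalg.pinv(L)       # Moore-Penrose pseudoinverse
    idx = np.array(vertices)
    return L_pinv[np.ix_(idx, idx)]

# Weighted tree on 4 vertices: a star with center 0 and
# edge weights (conductances) 1, 2, 1 to vertices 1, 2, 3.
A = np.array([[0, 1, 2, 1],
              [1, 0, 0, 0],
              [2, 0, 0, 0],
              [1, 0, 0, 0]], dtype=float)

# 2 x 2 Gram matrix for the vertex subset {1, 2}.
G = gram_submatrix(A, [1, 2])
```

As a sanity check, the Gram matrix recovers effective resistances: for vertices 1 and 2 the path through the center has resistance 1/1 + 1/2 = 1.5, and G[0,0] + G[1,1] − 2·G[0,1] matches it.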

Online Prediction on Large Diameter Graphs

no code implementations NeurIPS 2008 Mark Herbster, Guy Lever, Massimiliano Pontil

Current on-line learning algorithms for predicting the labelling of a graph have an important limitation in the case of large-diameter graphs: the number of mistakes made by such algorithms may be proportional to the square root of the number of vertices, even when tackling simple problems.
