1 code implementation • 15 Mar 2023 • Saad Hamid, Xingchen Wan, Martin Jørgensen, Binxin Ru, Michael Osborne
Ensembling can improve the performance of Neural Networks, but existing approaches struggle when the architecture likelihood surface has dispersed, narrow peaks.
1 code implementation • NeurIPS 2021 • Xingchen Wan, Henry Kenlay, Robin Ru, Arno Blaas, Michael Osborne, Xiaowen Dong
While the majority of the literature focuses on such vulnerability in node-level classification tasks, little effort has been dedicated to analysing adversarial attacks on graph-level classification, an important problem with numerous real-life applications such as biochemistry and social network analysis.
no code implementations • ICML Workshop AML 2021 • Xingchen Wan, Henry Kenlay, Binxin Ru, Arno Blaas, Michael Osborne, Xiaowen Dong
Graph neural networks have been shown to be vulnerable to adversarial attacks.
no code implementations • NeurIPS 2021 • Cong Lu, Philip Ball, Jack Parker-Holder, Michael Osborne, Stephen Roberts
Offline reinforcement learning enables agents to make use of large pre-collected datasets of environment transitions and learn control policies without the need for potentially expensive or unsafe online data collection.
1 code implementation • ICLR 2021 • Binxin Ru, Xingchen Wan, Xiaowen Dong, Michael Osborne
Our method optimises the architecture in a highly data-efficient manner: it is capable of capturing the topological structures of the architectures and is scalable to large graphs, thus making the high-dimensional and graph-like search spaces amenable to BO.
no code implementations • 19 Dec 2019 • Diego Granziol, Robin Ru, Stefan Zohren, Xiaowen Dong, Michael Osborne, Stephen Roberts
Graph spectral techniques for measuring graph similarity, or for learning the cluster number, require kernel smoothing.
4 code implementations • 1 Jul 2019 • Sebastian Farquhar, Michael Osborne, Yarin Gal
The Radial BNN is motivated by avoiding a sampling problem in 'mean-field' variational inference (MFVI) caused by the so-called 'soap-bubble' pathology of multivariate Gaussians.
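The 'soap-bubble' pathology can be seen directly: samples from a high-dimensional isotropic Gaussian concentrate in a thin shell of radius roughly the square root of the dimension, far from the mode at the origin. A minimal numerical illustration (not the Radial BNN itself, just the pathology that motivates it):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1000                                   # dimensionality of the Gaussian
samples = rng.standard_normal((10_000, d)) # draws from N(0, I_d)
radii = np.linalg.norm(samples, axis=1)

# Radii concentrate tightly around sqrt(d) ≈ 31.6 with std ≈ 0.71:
# almost no probability mass sits near the mode at radius 0.
print(radii.mean(), radii.std())
```

Because MFVI posteriors are multivariate Gaussians over high-dimensional weight vectors, essentially all sampled weight configurations lie in this shell, which is what the radial reparameterisation is designed to sidestep.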
no code implementations • 3 Jun 2019 • Diego Granziol, Binxin Ru, Stefan Zohren, Xiaowen Dong, Michael Osborne, Stephen Roberts
Efficient approximation lies at the heart of large-scale machine learning problems.
no code implementations • 25 Jan 2019 • Edward Wagstaff, Fabian B. Fuchs, Martin Engelcke, Ingmar Posner, Michael Osborne
Recent work on the representation of functions on sets has considered the use of summation in a latent space to enforce permutation invariance.
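The sum-pooling construction analysed here can be sketched in a few lines: embed each set element with a shared map, sum the embeddings, then apply a readout. The weights and sizes below are arbitrary placeholders, chosen only to show why summation makes the output order-independent:

```python
import numpy as np

rng = np.random.default_rng(1)
W_phi = rng.standard_normal((8, 3))   # shared per-element embedding (illustrative weights)
W_rho = rng.standard_normal((1, 8))   # readout applied to the pooled latent

def set_function(X):
    """X has shape (set_size, 3); the output is invariant to row order."""
    latent = np.tanh(X @ W_phi.T).sum(axis=0)  # summation in latent space
    return float(W_rho @ latent)

X = rng.standard_normal((5, 3))
# Permuting the set elements leaves the output unchanged:
assert np.isclose(set_function(X), set_function(X[::-1]))
```

The paper's question is about the latent dimension such a sum needs in order to represent all continuous permutation-invariant functions; the sketch only demonstrates the invariance itself.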
1 code implementation • 4 Dec 2018 • Ed Wagstaff, Saad Hamid, Michael Osborne
Integration over non-negative integrands is a central problem in machine learning (e.g. for model averaging, (hyper-)parameter marginalisation, and computing posterior predictive distributions).
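A toy instance of such an integral is the model evidence of a one-parameter Gaussian model, where the non-negative integrand is a likelihood times a prior. The sketch below evaluates it by simple grid quadrature against a known closed form; it illustrates the problem being targeted, not the paper's method:

```python
import numpy as np

# Toy model: observation y = theta + noise, theta ~ N(0,1), noise ~ N(0,1).
# Evidence Z = ∫ p(y|theta) p(theta) dtheta, with integrand >= 0 everywhere.
y = 0.7
theta = np.linspace(-8.0, 8.0, 2001)
prior = np.exp(-0.5 * theta**2) / np.sqrt(2 * np.pi)
lik = np.exp(-0.5 * (y - theta)**2) / np.sqrt(2 * np.pi)
Z = np.sum(lik * prior) * (theta[1] - theta[0])   # Riemann-sum quadrature

# Analytic check: marginally y ~ N(0, 2), so Z = N(y; 0, 2).
Z_true = np.exp(-0.25 * y**2) / np.sqrt(4 * np.pi)
assert abs(Z - Z_true) < 1e-6
```

Grid quadrature is only viable in one dimension; the point of probabilistic integration methods is to handle such integrals data-efficiently when each integrand evaluation is expensive.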
1 code implementation • 25 Nov 2018 • Jack Fitzsimons, Michael Osborne, Stephen Roberts
Group fairness is an important concern for machine learning researchers, developers, and regulators.
no code implementations • 10 Oct 2018 • Jack Fitzsimons, AbdulRahman Al Ali, Michael Osborne, Stephen Roberts
Fairness, through its many forms and definitions, has become an important issue facing the machine learning community.
no code implementations • 18 Apr 2018 • Diego Granziol, Binxin Ru, Stefan Zohren, Xiaowen Dong, Michael Osborne, Stephen Roberts
Graph spectra have been successfully used to classify network types, compute the similarity between graphs, and determine the number of communities in a network.
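The community-counting use of graph spectra rests on a standard fact: the multiplicity of the zero eigenvalue of the graph Laplacian equals the number of connected components. A minimal dense-matrix sketch (the paper's concern is doing this at scale without full eigendecomposition):

```python
import numpy as np

# Two disjoint triangles: a 6-node graph with 2 components.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A        # combinatorial graph Laplacian
eigvals = np.linalg.eigvalsh(L)

# Zero eigenvalues of L count connected components (here, 2).
n_components = int(np.sum(eigvals < 1e-8))
assert n_components == 2
```

For well-connected but clustered graphs, the same idea is relaxed to counting small (rather than exactly zero) eigenvalues, which is where smoothed spectral estimates come in.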
no code implementations • 21 Feb 2018 • Diego Granziol, Edward Wagstaff, Binxin Ru, Michael Osborne, Stephen Roberts
Evaluating the log determinant of a positive definite matrix is ubiquitous in machine learning.
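For matrices that fit in memory, the standard stable computation goes through the Cholesky factorisation rather than the determinant itself, which would overflow or underflow for even moderately sized matrices. A short sketch of that baseline (the papers in this line target the regime where even a Cholesky factorisation is too expensive):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((200, 200))
A = B @ B.T + 200 * np.eye(200)       # positive definite by construction

# For SPD A = L L^T, log|A| = 2 * sum(log(diag(L))): the log determinant
# is accumulated in log space, so nothing overflows.
chol = np.linalg.cholesky(A)
logdet = 2.0 * np.sum(np.log(np.diag(chol)))

sign, ref = np.linalg.slogdet(A)
assert sign == 1.0 and abs(logdet - ref) < 1e-6
```

Cholesky costs O(n^3), which motivates the stochastic trace and spectral approximations studied in this and the related entries below.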
no code implementations • 12 Nov 2017 • Pengfei Zhang, Ido Nevat, Gareth W. Peters, Wolfgang Fruehwirt, Yongchao Huang, Ivonne Anders, Michael Osborne
Next, building on the S-BLUE, we address the second problem and develop an efficient algorithm for query-based sensor-set selection with a performance guarantee.
1 code implementation • 24 Apr 2017 • Jack Fitzsimons, Diego Granziol, Kurt Cutajar, Michael Osborne, Maurizio Filippone, Stephen Roberts
The scalable calculation of matrix determinants has been a bottleneck to the widespread application of many machine learning methods such as determinantal point processes, Gaussian processes, generalised Markov random fields, graph models and many others.
no code implementations • 5 Apr 2017 • Jack Fitzsimons, Kurt Cutajar, Michael Osborne, Stephen Roberts, Maurizio Filippone
The log-determinant of a kernel matrix appears in a variety of machine learning problems, ranging from determinantal point processes and generalized Markov random fields, through to the training of Gaussian processes.
no code implementations • 21 Oct 2015 • Javier González, Michael Osborne, Neil D. Lawrence
We present GLASSES: Global optimisation with Look-Ahead through Stochastic Simulation and Expected-loss Search.
no code implementations • 2 Jul 2015 • Steven Reece, Roman Garnett, Michael Osborne, Stephen Roberts
This paper proposes a novel Gaussian process approach to fault removal in time-series data.
no code implementations • 18 Mar 2014 • Nabeel Gillani, Rebecca Eynon, Michael Osborne, Isis Hjorth, Stephen Roberts
Massive Open Online Courses (MOOCs) bring together thousands of people from different geographies and demographic backgrounds -- but to date, little is known about how they learn or communicate.
no code implementations • 17 Feb 2014 • Jan-Peter Calliess, Michael Osborne, Stephen Roberts
Existing work in multi-agent collision prediction and avoidance typically assumes discrete-time trajectories that either carry Gaussian uncertainty or are completely deterministic.
no code implementations • NeurIPS 2012 • Michael Osborne, Roman Garnett, Zoubin Ghahramani, David K. Duvenaud, Stephen J. Roberts, Carl E. Rasmussen
Numerical integration is a key component of many problems in scientific computing, statistical modelling, and machine learning.
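The basic Bayesian-quadrature idea behind this line of work admits a compact one-dimensional sketch: place a GP on the integrand, then the posterior mean of the integral is a weighted sum of function evaluations, with weights obtained from the closed-form kernel mean of an RBF kernel against a Gaussian measure. The nodes, lengthscale, and test integrand below are arbitrary choices for illustration, not the active-sampling scheme of the paper:

```python
import numpy as np

# Estimate Z = ∫ f(x) N(x; 0, 1) dx with a GP surrogate on f.
# RBF kernel k(x, x') = exp(-(x - x')^2 / (2 l^2)); against a standard
# normal measure its kernel mean z_i = ∫ k(x, x_i) N(x; 0, 1) dx is
# available in closed form (used below).
l = 0.8
xs = np.linspace(-3.0, 3.0, 15)   # fixed quadrature nodes (illustrative design)
f = xs**2                          # integrand values; true Z = E[x^2] = 1

K = np.exp(-0.5 * (xs[:, None] - xs[None, :])**2 / l**2)
z = np.sqrt(l**2 / (l**2 + 1)) * np.exp(-0.5 * xs**2 / (l**2 + 1))

# BQ posterior mean of the integral: z^T K^{-1} f
weights = np.linalg.solve(K + 1e-8 * np.eye(len(xs)), f)
Z_hat = float(z @ weights)
assert abs(Z_hat - 1.0) < 0.05
```

The paper's contribution is in choosing where to evaluate the integrand actively when computing model evidence; the fixed-design estimate above only shows the quadrature rule itself.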