no code implementations • 7 Mar 2023 • Alireza Mousavi-Hosseini, Tyler Farghly, Ye He, Krishnakumar Balasubramanian, Murat A. Erdogdu
We do so by establishing upper and lower bounds for Langevin diffusions and Langevin Monte Carlo (LMC) under weak Poincaré inequalities that are satisfied by a large class of densities, including polynomially-decaying heavy-tailed densities (i.e., Cauchy-type).
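For intuition, here is a minimal sketch of the unadjusted LMC iteration that such bounds concern, run on a polynomially-decaying Cauchy-type target. The target, step size, and function names are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def lmc_sample(grad_log_density, x0, step_size=0.01, n_steps=10_000, rng=None):
    """Unadjusted LMC: x_{k+1} = x_k + h * grad log pi(x_k) + sqrt(2h) * N(0, I)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        noise = rng.standard_normal(x.size)
        x = x + step_size * grad_log_density(x) + np.sqrt(2 * step_size) * noise
        samples[k] = x
    return samples

# Illustrative heavy-tailed target pi(x) proportional to (1 + ||x||^2)^(-beta),
# whose score is grad log pi(x) = -2*beta*x / (1 + ||x||^2).
beta = 2.0
grad_log_pi = lambda x: -2 * beta * x / (1 + np.dot(x, x))
chain = lmc_sample(grad_log_pi, x0=np.zeros(2))
```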
no code implementations • 1 Mar 2023 • Ye He, Tyler Farghly, Krishnakumar Balasubramanian, Murat A. Erdogdu
We analyze the complexity of sampling from a class of heavy-tailed distributions by discretizing a natural class of Itô diffusions associated with weighted Poincaré inequalities.
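As a rough illustration (not the paper's specific scheme), the sketch below applies an Euler-Maruyama discretization to an Itô diffusion with state-dependent noise whose stationary law is a polynomially-decaying target. The choice sigma^2(x) = 1 + ||x||^2, and the linear drift it induces via the stationarity condition b = (grad sigma^2 + sigma^2 * grad log pi) / 2, are assumptions made for the sketch.

```python
import numpy as np

def euler_maruyama(drift, diffusion_coef, x0, step_size=1e-3, n_steps=50_000, rng=None):
    """Euler-Maruyama scheme: X_{k+1} = X_k + h * b(X_k) + sqrt(h) * sigma(X_k) * xi_k."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    path = np.empty((n_steps, x.size))
    for k in range(n_steps):
        xi = rng.standard_normal(x.size)
        x = x + step_size * drift(x) + np.sqrt(step_size) * diffusion_coef(x) * xi
        path[k] = x
    return path

# Heavy-tailed target pi(x) proportional to (1 + ||x||^2)^(-beta) with the
# state-dependent noise sigma(x)^2 = 1 + ||x||^2; the stationarity condition
# b = (grad sigma^2 + sigma^2 * grad log pi) / 2 then gives b(x) = (1 - beta) * x.
beta = 2.0
drift = lambda x: (1.0 - beta) * x
sigma = lambda x: np.sqrt(1.0 + np.dot(x, x))
path = euler_maruyama(drift, sigma, x0=np.zeros(2))
```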
no code implementations • 6 Sep 2022 • Eugenio Clerico, Tyler Farghly, George Deligiannidis, Benjamin Guedj, Arnaud Doucet
We establish disintegrated PAC-Bayesian generalisation bounds for models trained with gradient descent methods or continuous gradient flows.
no code implementations • NeurIPS 2021 • Tyler Farghly, Patrick Rebeschini
We establish generalization error bounds for stochastic gradient Langevin dynamics (SGLD) with constant learning rate under the assumptions of dissipativity and smoothness, a setting that has received increased attention in the sampling/optimization literature.
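A minimal sketch of constant-step-size SGLD in the dissipative, smooth setting the bounds assume; the toy ridge-regression objective and all parameter values are hypothetical, chosen only to make the update concrete.

```python
import numpy as np

def sgld(grad_loss, data, w0, learning_rate=1e-3, inverse_temp=1e4,
         batch_size=32, n_steps=5_000, rng=None):
    """SGLD with constant step size:
    w <- w - eta * g_hat(w) + sqrt(2 * eta / beta) * N(0, I),
    where g_hat is an unbiased minibatch gradient of the empirical risk."""
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(w0, dtype=float)
    for _ in range(n_steps):
        batch = data[rng.choice(len(data), size=batch_size, replace=False)]
        g = grad_loss(w, batch)
        noise = rng.standard_normal(w.size)
        w = w - learning_rate * g + np.sqrt(2 * learning_rate / inverse_temp) * noise
    return w

# Toy example: ridge-regularized least squares (a dissipative, smooth objective).
rng = np.random.default_rng(0)
X = rng.standard_normal((256, 5))
y = X @ np.ones(5) + 0.1 * rng.standard_normal(256)
data = np.hstack([X, y[:, None]])
grad = lambda w, b: b[:, :-1].T @ (b[:, :-1] @ w - b[:, -1]) / len(b) + 0.1 * w
w_hat = sgld(grad, data, w0=np.zeros(5))
```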
1 code implementation • ACL 2020 • Sam Coope, Tyler Farghly, Daniela Gerz, Ivan Vulić, Matthew Henderson
We introduce Span-ConveRT, a lightweight model for dialog slot-filling that frames the task as turn-based span extraction.
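To illustrate the span-extraction framing only (this is not the Span-ConveRT architecture, which builds on pretrained ConveRT subword representations), here is a hypothetical start/end-scoring head over token representations:

```python
import numpy as np

def extract_span(token_reprs, w_start, w_end, max_len=10):
    """Score every (start, end) pair with start/end logits and return the best span.
    token_reprs is any (n_tokens, dim) array; w_start and w_end stand in for
    learned scoring vectors."""
    start_logits = token_reprs @ w_start   # shape: (n_tokens,)
    end_logits = token_reprs @ w_end       # shape: (n_tokens,)
    best, best_score = (0, 0), -np.inf
    for i, s in enumerate(start_logits):
        for j in range(i, min(i + max_len, len(end_logits))):
            score = s + end_logits[j]
            if score > best_score:
                best, best_score = (i, j), score
    return best  # inclusive token indices of the predicted slot value

# e.g. for the tokens of "wake me at seven am", the span (3, 4) covers "seven am"
reprs = np.random.default_rng(1).standard_normal((5, 8))
w_s, w_e = np.ones(8), np.ones(8)
print(extract_span(reprs, w_s, w_e))
```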