Search Results for author: Tyler Farghly

Found 5 papers, 1 paper with code

Towards a Complete Analysis of Langevin Monte Carlo: Beyond Poincaré Inequality

no code implementations 7 Mar 2023 Alireza Mousavi-Hosseini, Tyler Farghly, Ye He, Krishnakumar Balasubramanian, Murat A. Erdogdu

We do so by establishing upper and lower bounds for Langevin diffusions and LMC under weak Poincaré inequalities that are satisfied by a large class of densities, including polynomially decaying heavy-tailed densities (i.e., Cauchy-type).
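
For context, the Langevin Monte Carlo (LMC) algorithm analyzed here is the Euler discretization of the Langevin diffusion, x_{k+1} = x_k - eta * grad V(x_k) + sqrt(2 * eta) * xi_k. Below is a minimal sketch; the Cauchy-type potential, the step size, and all names are illustrative placeholders, not taken from the paper:

```python
import numpy as np

def lmc(grad_V, x0, eta, n_steps, seed=0):
    """Langevin Monte Carlo: x_{k+1} = x_k - eta * grad_V(x_k) + sqrt(2*eta) * xi_k."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x - eta * grad_V(x) + np.sqrt(2 * eta) * rng.standard_normal(x.shape)
        samples[k] = x
    return samples

# Illustrative Cauchy-type target: pi(x) ~ (1 + ||x||^2)^(-(d+1)/2),
# i.e. V(x) = (d+1)/2 * log(1 + ||x||^2) and grad V(x) = (d+1) x / (1 + ||x||^2).
d = 2
grad_V = lambda x: (d + 1) * x / (1.0 + x @ x)
samples = lmc(grad_V, x0=np.zeros(d), eta=1e-2, n_steps=10_000)
```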

Mean-Square Analysis of Discretized Itô Diffusions for Heavy-tailed Sampling

no code implementations 1 Mar 2023 Ye He, Tyler Farghly, Krishnakumar Balasubramanian, Murat A. Erdogdu

We analyze the complexity of sampling from a class of heavy-tailed distributions by discretizing a natural class of Itô diffusions associated with weighted Poincaré inequalities.
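
The discretization in question is the Euler-Maruyama scheme applied to an Itô diffusion dX_t = mu(X_t) dt + sigma(X_t) dW_t. A generic sketch follows, assuming a diagonal (elementwise) diffusion coefficient; the drift and diffusion functions here are illustrative, not the specific weighted-Poincaré diffusions constructed in the paper:

```python
import numpy as np

def euler_maruyama(mu, sigma, x0, dt, n_steps, seed=0):
    """Euler-Maruyama discretization of dX_t = mu(X_t) dt + sigma(X_t) dW_t."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    path = [x.copy()]
    for _ in range(n_steps):
        dW = np.sqrt(dt) * rng.standard_normal(x.shape)  # Brownian increment
        x = x + mu(x) * dt + sigma(x) * dW               # one Euler-Maruyama step
        path.append(x.copy())
    return np.array(path)

# Illustrative state-dependent coefficients with multiplicative noise:
mu = lambda x: -x / (1.0 + x @ x)
sigma = lambda x: np.sqrt(1.0 + x @ x) * np.ones_like(x)
path = euler_maruyama(mu, sigma, x0=np.zeros(2), dt=1e-3, n_steps=5_000)
```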

Generalisation under gradient descent via deterministic PAC-Bayes

no code implementations 6 Sep 2022 Eugenio Clerico, Tyler Farghly, George Deligiannidis, Benjamin Guedj, Arnaud Doucet

We establish disintegrated PAC-Bayesian generalisation bounds for models trained with gradient descent methods or continuous gradient flows.
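
The trajectories covered by these bounds are those of full-batch gradient descent, w_{k+1} = w_k - eta * grad L(w_k), viewed as a discretization of the continuous gradient flow dw_t/dt = -grad L(w_t). A minimal sketch; the quadratic loss and all names are hypothetical, not the models studied in the paper:

```python
import numpy as np

def gradient_descent(grad_L, w0, eta, n_steps):
    """Full-batch gradient descent: w_{k+1} = w_k - eta * grad_L(w_k)."""
    w = np.asarray(w0, dtype=float)
    trajectory = [w.copy()]
    for _ in range(n_steps):
        w = w - eta * grad_L(w)
        trajectory.append(w.copy())
    return trajectory

# Hypothetical quadratic loss L(w) = 0.5 * ||A w - b||^2, grad L(w) = A.T @ (A w - b).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, -1.0])
traj = gradient_descent(lambda w: A.T @ (A @ w - b), w0=np.zeros(2), eta=0.1, n_steps=200)
```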

Time-independent Generalization Bounds for SGLD in Non-convex Settings

no code implementations NeurIPS 2021 Tyler Farghly, Patrick Rebeschini

We establish generalization error bounds for stochastic gradient Langevin dynamics (SGLD) with constant learning rate under the assumptions of dissipativity and smoothness, a setting that has received increased attention in the sampling/optimization literature.
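
SGLD with constant learning rate replaces the exact gradient in LMC with a minibatch estimate: w_{k+1} = w_k - eta * g_k + sqrt(2 * eta / beta) * xi_k. A minimal sketch; grad_loss, the inverse temperature beta, and the sampling scheme are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

def sgld(grad_loss, data, w0, eta, beta, n_steps, batch_size, seed=0):
    """SGLD with constant step size eta and inverse temperature beta:
    w_{k+1} = w_k - eta * g_k + sqrt(2*eta/beta) * xi_k,
    where g_k is an unbiased minibatch gradient estimate."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float)
    n = len(data)
    for _ in range(n_steps):
        batch = data[rng.choice(n, size=batch_size, replace=False)]
        w = (w - eta * grad_loss(w, batch)
             + np.sqrt(2 * eta / beta) * rng.standard_normal(w.shape))
    return w

# Hypothetical objective: 0.5 * mean ||w - x_i||^2, whose gradient is mean(w - x_i).
data = np.random.default_rng(1).standard_normal((100, 2))
grad_loss = lambda w, batch: np.mean(w - batch, axis=0)
w = sgld(grad_loss, data, w0=np.zeros(2), eta=1e-2, beta=10.0, n_steps=2_000, batch_size=10)
```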

