no code implementations • NeurIPS 2023 • Ali Younis, Erik Sudderth
Particle filters flexibly represent multiple posterior modes nonparametrically, via a collection of weighted samples, but have classically been applied to tracking problems with known dynamics and observation likelihoods.
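The classical setting described above can be sketched in a few lines: a bootstrap particle filter tracking a 1-D state, where both the dynamics and the observation likelihood are known Gaussians. This is a generic illustration of the weighted-sample posterior representation, not the method of the paper; all model parameters here (`trans_std`, `obs_std`, the random-walk dynamics) are illustrative assumptions.

```python
import math, random

def bootstrap_particle_filter(observations, n_particles=500,
                              trans_std=1.0, obs_std=1.0, seed=0):
    """Track a 1-D latent state with a bootstrap particle filter.

    Assumed (known) model, for illustration only:
      dynamics:   x_t = x_{t-1} + Normal(0, trans_std^2)
      likelihood: y_t = x_t     + Normal(0, obs_std^2)
    """
    rng = random.Random(seed)
    # Posterior is represented nonparametrically by weighted samples.
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # Propagate each particle through the known dynamics.
        particles = [x + rng.gauss(0.0, trans_std) for x in particles]
        # Reweight by the known observation likelihood.
        weights = [math.exp(-0.5 * ((y - x) / obs_std) ** 2) for x in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Posterior-mean estimate from the weighted sample collection.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling to combat weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

Because the posterior is a weighted sample set rather than a single Gaussian, the same machinery can represent multiple modes, e.g. when observations are ambiguous between two states.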
no code implementations • NeurIPS 2021 • Henry Bendekgey, Erik Sudderth
We investigate how fairness relaxations scale to flexible classifiers like deep neural networks for images and text.
no code implementations • NeurIPS 2017 • Daniel Milstein, Jason Pacheco, Leigh Hochberg, John D. Simeral, Beata Jarosiewicz, Erik Sudderth
We propose a dynamic Bayesian network that includes the on-screen goal position as part of its latent state, and thus allows the person’s intended angle of movement to be aggregated over a much longer history of neural activity.
1 code implementation • NeurIPS 2015 • Michael C. Hughes, William T. Stephenson, Erik Sudderth
Bayesian nonparametric hidden Markov models are typically learned via fixed truncations of the infinite state space or local Monte Carlo proposals that make small changes to the state space.
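The "fixed truncation" baseline mentioned above is easy to make concrete: a Dirichlet-process-style stick-breaking prior over state weights is simply cut off at K components. A minimal sketch, assuming a DP prior with concentration `alpha` (this illustrates the truncation idea only, not the paper's proposal-based alternative).

```python
import random

def truncated_stick_breaking(alpha, K, seed=0):
    """Draw mixture/state weights from a Dirichlet-process stick-breaking
    prior truncated to K components (the classic fixed truncation)."""
    rng = random.Random(seed)
    weights, remaining = [], 1.0
    for _ in range(K - 1):
        # Beta(1, alpha) proportion of the remaining stick.
        v = rng.betavariate(1.0, alpha)
        weights.append(remaining * v)
        remaining *= 1.0 - v
    weights.append(remaining)  # final component absorbs the leftover stick
    return weights
```

The truncation level K must be chosen in advance; states beyond K are simply unavailable to inference, which is the limitation that adaptive proposals aim to remove.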
no code implementations • NeurIPS 2013 • Michael C. Hughes, Erik Sudderth
Variational inference algorithms provide the most effective framework for large-scale training of Bayesian nonparametric models.
no code implementations • NeurIPS 2013 • Dae Il Kim, Prem K. Gopalan, David Blei, Erik Sudderth
In large social networks, we expect entities to participate in multiple communities, and the number of communities to grow with the network size.