1 code implementation • 6 Feb 2024 • Matthew Ho, Deaglan J. Bartlett, Nicolas Chartier, Carolina Cuesta-Lazaro, Simon Ding, Axel Lapel, Pablo Lemos, Christopher C. Lovell, T. Lucas Makinen, Chirag Modi, Viraj Pandya, Shivam Pandey, Lucia A. Perez, Benjamin Wandelt, Greg L. Bryan
This paper presents the Learning the Universe Implicit Likelihood Inference (LtU-ILI) pipeline, a codebase for rapid, user-friendly, and cutting-edge machine learning (ML) inference in astrophysics and cosmology.
no code implementations • 18 May 2023 • Matthew Ho, Xiaosheng Zhao, Benjamin Wandelt
We present the information-ordered bottleneck (IOB), a neural layer designed to adaptively compress data into latent variables ordered by likelihood maximization.
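The ordering idea behind such a bottleneck can be illustrated with a prefix mask: during training, latent dimensions beyond a randomly drawn width are zeroed out, so the network is pushed to pack the most useful information into the earliest dimensions. This is a generic sketch of that mechanism, not the paper's exact scheme; the function name and dimensions are illustrative.

```python
import numpy as np

def prefix_mask(z, k):
    """Keep the first k latent dimensions and zero out the rest.

    Drawing k at random per training batch encourages an encoder to
    order its latents by usefulness, since any prefix must suffice
    for reconstruction (a generic sketch of the ordering mechanism).
    """
    mask = np.zeros_like(z)
    mask[..., :k] = 1.0
    return z * mask

rng = np.random.default_rng(0)
z = rng.normal(size=(4, 8))               # toy batch of 8-dim latents
k = int(rng.integers(1, z.shape[-1] + 1))  # random bottleneck width
z_masked = prefix_mask(z, k)              # only the first k dims survive
```

In practice the mask would be applied inside an autoencoder's latent layer, with the reconstruction loss averaged over random widths.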
no code implementations • 21 Oct 2022 • Matthew Ho, Aditya Sharma, Justin Chang, Michael Saxon, Sharon Levy, Yujie Lu, William Yang Wang
As large language models (LLMs) grow larger and more sophisticated, assessing their "reasoning" capabilities in natural language grows more challenging.
1 code implementation • 23 Jun 2020 • Matthew Ho, Arya Farahi, Markus Michael Rau, Hy Trac
We study methods for reconstructing Bayesian uncertainties on dynamical mass estimates of galaxy clusters using convolutional neural networks (CNNs).
Cosmology and Nongalactic Astrophysics
1 code implementation • 15 Feb 2019 • Matthew Ho, Markus Michael Rau, Michelle Ntampaka, Arya Farahi, Hy Trac, Barnabas Poczos
Our first model, CNN$_\text{1D}$, infers cluster mass directly from the distribution of member galaxy line-of-sight velocities.
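The core operation of such a model can be sketched as a 1D convolution applied to a histogram of member-galaxy line-of-sight velocities. The bin ranges, kernel, and toy data below are illustrative assumptions, not the paper's architecture or training setup.

```python
import numpy as np

def conv1d(x, kernel):
    """Valid-mode 1D cross-correlation, as computed by a CNN layer."""
    n = len(x) - len(kernel) + 1
    return np.array([np.dot(x[i:i + len(kernel)], kernel) for i in range(n)])

# Hypothetical input: a histogram of line-of-sight velocities (km/s).
rng = np.random.default_rng(1)
v_los = rng.normal(0.0, 1000.0, size=500)          # toy velocity sample
hist, _ = np.histogram(v_los, bins=32, range=(-3000.0, 3000.0))

# One convolution + ReLU, the basic building block of a 1D CNN;
# a full model would stack such layers and end in a regression head.
kernel = np.array([0.25, 0.5, 0.25])
feat = np.maximum(conv1d(hist.astype(float), kernel), 0.0)
```

A mass estimate would then come from pooling these features and passing them through dense layers; the sketch only shows how a 1D velocity distribution enters the network.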
Cosmology and Nongalactic Astrophysics