Search Results for author: Tobias Golling

Found 21 papers, 12 papers with code

Masked Particle Modeling on Sets: Towards Self-Supervised High Energy Physics Foundation Models

1 code implementation · 24 Jan 2024 · Lukas Heinrich, Tobias Golling, Michael Kagan, Samuel Klein, Matthew Leigh, Margarita Osadchy, John Andrew Raine

We propose masked particle modeling (MPM) as a self-supervised method for learning generic, transferable, and reusable representations of unordered sets of inputs for use with high energy physics (HEP) scientific data.

Self-Supervised Learning
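A minimal sketch of the masked-particle-modeling idea: hide a random subset of the particles in a jet and train a permutation-insensitive encoder to recover them. The encoder, feature dimensions, masking fraction, and the continuous regression target below are illustrative assumptions, not the paper's exact setup.

```python
import torch
import torch.nn as nn

class MaskedParticleModel(nn.Module):
    """Toy masked-particle model: hide a fraction of the particles in a jet
    (an unordered set of feature vectors) and regress their original features."""

    def __init__(self, n_features=4, d_model=64, n_heads=4, n_layers=3):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        self.mask_token = nn.Parameter(torch.zeros(d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        # No positional encoding, so the encoder treats the particles as a set.
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_features)

    def forward(self, particles, mask_frac=0.3):
        # particles: (batch, n_particles, n_features), e.g. (pt, eta, phi, E)
        tokens = self.embed(particles)
        hidden = torch.rand(particles.shape[:2], device=particles.device) < mask_frac
        tokens[hidden] = self.mask_token
        pred = self.head(self.encoder(tokens))
        return ((pred - particles) ** 2)[hidden].mean()  # loss only on masked particles

model = MaskedParticleModel()
jets = torch.randn(8, 30, 4)      # toy batch: 8 jets x 30 particles x 4 features
loss = model(jets)
loss.backward()
```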

Improving new physics searches with diffusion models for event observables and jet constituents

no code implementations · 15 Dec 2023 · Debajyoti Sengupta, Matthew Leigh, John Andrew Raine, Samuel Klein, Tobias Golling

We introduce a new technique called Drapes to enhance the sensitivity in searches for new physics at the LHC.

EPiC-ly Fast Particle Cloud Generation with Flow-Matching and Diffusion

no code implementations · 29 Sep 2023 · Erik Buhmann, Cedric Ewen, Darius A. Faroughy, Tobias Golling, Gregor Kasieczka, Matthew Leigh, Guillaume Quétant, John Andrew Raine, Debajyoti Sengupta, David Shih

In addition, we introduce EPiC-FM, the first permutation-equivariant continuous normalizing flow (CNF) for particle cloud generation.
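As a rough illustration of the flow-matching ingredient, the common conditional-flow-matching objective with a linear probability path is sketched below; `velocity_net` is a stand-in per-particle MLP, not the permutation-equivariant EPiC architecture used in the paper.

```python
import torch
import torch.nn as nn

# Stand-in for a permutation-equivariant velocity network acting per particle.
velocity_net = nn.Sequential(nn.Linear(3 + 1, 128), nn.SiLU(), nn.Linear(128, 3))

def flow_matching_loss(x1):
    """x1: (batch, n_particles, 3) real particle-cloud features."""
    x0 = torch.randn_like(x1)                  # noise sample
    t = torch.rand(x1.shape[0], 1, 1)          # one time per cloud
    xt = (1 - t) * x0 + t * x1                 # linear probability path
    target_v = x1 - x0                         # its constant velocity
    inp = torch.cat([xt, t.expand(-1, x1.shape[1], 1)], dim=-1)
    return ((velocity_net(inp) - target_v) ** 2).mean()

loss = flow_matching_loss(torch.randn(16, 30, 3))
loss.backward()
```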

Flows for Flows: Morphing one Dataset into another with Maximum Likelihood Estimation

no code implementations · 12 Sep 2023 · Tobias Golling, Samuel Klein, Radha Mastandrea, Benjamin Nachman, John Andrew Raine

We propose a protocol called flows for flows for training normalizing flows to morph one dataset into another even if the underlying probability density of neither dataset is known explicitly.

MORPH

PC-Droid: Faster diffusion and improved quality for particle cloud generation

no code implementations · 13 Jul 2023 · Matthew Leigh, Debajyoti Sengupta, John Andrew Raine, Guillaume Quétant, Tobias Golling

Building on the success of PC-JeDi, we introduce PC-Droid, a substantially improved diffusion model for the generation of jet particle clouds.
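A hedged sketch of one diffusion training step in the DDPM style, with a plain MLP standing in for the permutation-equivariant denoiser; the noise schedule and the actual PC-Droid parameterisation are assumptions here, not the paper's.

```python
import torch
import torch.nn as nn

T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alpha_bar = torch.cumprod(1.0 - betas, dim=0)      # cumulative noise schedule

# Stand-in denoiser; PC-Droid uses a permutation-equivariant architecture instead.
denoiser = nn.Sequential(nn.Linear(3 + 1, 128), nn.SiLU(), nn.Linear(128, 3))

def diffusion_loss(x0):
    """x0: (batch, n_particles, 3) clean particle clouds; predict the added noise."""
    b = x0.shape[0]
    t = torch.randint(0, T, (b,))
    a = alpha_bar[t].view(b, 1, 1)
    eps = torch.randn_like(x0)
    xt = a.sqrt() * x0 + (1 - a).sqrt() * eps      # forward (noising) process
    t_feat = (t.float() / T).view(b, 1, 1).expand(-1, x0.shape[1], 1)
    eps_hat = denoiser(torch.cat([xt, t_feat], dim=-1))
    return ((eps_hat - eps) ** 2).mean()

loss = diffusion_loss(torch.randn(16, 30, 3))
loss.backward()
```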

Decorrelation using Optimal Transport

1 code implementation · 11 Jul 2023 · Malte Algren, John Andrew Raine, Tobias Golling

Decorrelating a feature space from protected attributes is an area of active research in ethics and fairness, as well as in the natural sciences.

Binary Classification · Ethics +2
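In one dimension the optimal transport map between distributions is the quantile (CDF) transform, so the idea can be sketched by transporting a feature to a common reference within bins of the protected attribute. This is only a toy illustration of the general approach, not the paper's method.

```python
import numpy as np

def decorrelate_1d(feature, protected, n_bins=10):
    """Map `feature` to a uniform reference within each bin of `protected`,
    so the transformed feature carries no information about the protected attribute.
    In 1D this empirical CDF (quantile) transform is the optimal transport map."""
    edges = np.quantile(protected, np.linspace(0, 1, n_bins + 1))
    out = np.empty_like(feature, dtype=float)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (protected >= lo) & (protected <= hi)
        ranks = np.argsort(np.argsort(feature[sel]))
        out[sel] = (ranks + 0.5) / sel.sum()        # empirical CDF -> Uniform(0, 1)
    return out

rng = np.random.default_rng(0)
mass = rng.exponential(50.0, 10_000)                # protected attribute (e.g. jet mass)
score = 0.02 * mass + rng.normal(size=10_000)       # feature correlated with mass
decorr = decorrelate_1d(score, mass)
print(np.corrcoef(decorr, mass)[0, 1])              # approximately 0 after transport
```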

$\nu^2$-Flows: Fast and improved neutrino reconstruction in multi-neutrino final states with conditional normalizing flows

1 code implementation · 5 Jul 2023 · John Andrew Raine, Matthew Leigh, Knut Zoch, Tobias Golling

In this work we introduce $\nu^2$-Flows, an extension of the $\nu$-Flows method to final states containing multiple neutrinos.
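A compact sketch of a conditional normalizing flow of the kind described above, written in plain PyTorch rather than with the library and architecture used for $\nu^2$-Flows; the six-dimensional target (momenta of two neutrinos) and twelve conditioning features are illustrative choices.

```python
import math
import torch
import torch.nn as nn

class ConditionalCoupling(nn.Module):
    """Affine coupling: half of the variables get a scale and shift that depend on
    the other half and on the conditioning (observed-event) features. Assumes even dim."""
    def __init__(self, dim, ctx_dim, flip, hidden=64):
        super().__init__()
        self.half, self.flip = dim // 2, flip
        self.net = nn.Sequential(nn.Linear(dim // 2 + ctx_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (dim // 2)))

    def _split(self, x):
        a, b = x[..., :self.half], x[..., self.half:]
        return (b, a) if self.flip else (a, b)

    def _join(self, a, b):
        return torch.cat([b, a], dim=-1) if self.flip else torch.cat([a, b], dim=-1)

    def _params(self, a, ctx):
        s, t = self.net(torch.cat([a, ctx], dim=-1)).chunk(2, dim=-1)
        return torch.tanh(s), t                    # bounded log-scale for stability

    def forward(self, z, ctx):                     # latent -> data (sampling direction)
        a, b = self._split(z)
        s, t = self._params(a, ctx)
        return self._join(a, b * s.exp() + t)

    def inverse(self, x, ctx):                     # data -> latent (training), with log|det|
        a, b = self._split(x)
        s, t = self._params(a, ctx)
        return self._join(a, (b - t) * (-s).exp()), -s.sum(-1)

class ConditionalFlow(nn.Module):
    """Stack of couplings: log_prob for maximum-likelihood training,
    sample to draw neutrino-momentum candidates for a given event."""
    def __init__(self, dim=6, ctx_dim=12, n_blocks=6):
        super().__init__()
        self.dim = dim
        self.blocks = nn.ModuleList(
            [ConditionalCoupling(dim, ctx_dim, flip=(i % 2 == 1)) for i in range(n_blocks)])

    def log_prob(self, x, ctx):
        logdet = 0.0
        for block in reversed(self.blocks):
            x, ld = block.inverse(x, ctx)
            logdet = logdet + ld
        base = -0.5 * (x ** 2).sum(-1) - 0.5 * self.dim * math.log(2 * math.pi)
        return base + logdet

    def sample(self, ctx):
        z = torch.randn(ctx.shape[0], self.dim)
        for block in self.blocks:
            z = block(z, ctx)
        return z

# Train by maximising log p(neutrino momenta | observed event); sample at inference.
flow = ConditionalFlow()
nu, obs = torch.randn(128, 6), torch.randn(128, 12)   # toy targets and conditions
loss = -flow.log_prob(nu, obs).mean()
loss.backward()
```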

CURTAINs Flows For Flows: Constructing Unobserved Regions with Maximum Likelihood Estimation

no code implementations · 8 May 2023 · Debajyoti Sengupta, Samuel Klein, John Andrew Raine, Tobias Golling

Model independent techniques for constructing background data templates using generative models have shown great promise for use in searches for new physics processes at the LHC.

Anomaly Detection

Flow Away your Differences: Conditional Normalizing Flows as an Improvement to Reweighting

no code implementations · 28 Apr 2023 · Malte Algren, Tobias Golling, Manuel Guth, Chris Pollard, John Andrew Raine

We present an alternative to reweighting techniques for modifying distributions to account for a desired change in an underlying conditional distribution, as is often needed to correct for mis-modelling in a simulated sample.

Topological Reconstruction of Particle Physics Processes using Graph Neural Networks

1 code implementation · 24 Mar 2023 · Lukas Ehrke, John Andrew Raine, Knut Zoch, Manuel Guth, Tobias Golling

We present a new approach, the Topograph, which reconstructs underlying physics processes, including the intermediary particles, by leveraging underlying priors from the nature of particle physics decays and the flexibility of message passing graph neural networks.
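A minimal message-passing layer over a fully connected graph of reconstructed objects, just to illustrate the mechanism; the Topograph architecture, its edge definitions, and its training targets are not reproduced here.

```python
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    """One round of message passing on a fully connected event graph:
    build a message for every ordered pair of objects, aggregate per node, update."""
    def __init__(self, d):
        super().__init__()
        self.edge_mlp = nn.Sequential(nn.Linear(2 * d, d), nn.ReLU(), nn.Linear(d, d))
        self.node_mlp = nn.Sequential(nn.Linear(2 * d, d), nn.ReLU(), nn.Linear(d, d))

    def forward(self, h):
        # h: (batch, n_objects, d) embeddings of jets, leptons, MET, ...
        n = h.shape[1]
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)     # receiver copies
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)     # sender copies
        messages = self.edge_mlp(torch.cat([hi, hj], dim=-1)).sum(dim=2)
        return h + self.node_mlp(torch.cat([h, messages], dim=-1))

layer = MessagePassingLayer(d=32)
event = torch.randn(4, 10, 32)       # 4 events, 10 reconstructed objects each
updated = layer(event)
# Edge scores (e.g. "do objects i and j share a parent particle?") can then be
# predicted from pairs of updated embeddings.
```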

Flows for Flows: Training Normalizing Flows Between Arbitrary Distributions with Maximum Likelihood Estimation

1 code implementation · 4 Nov 2022 · Samuel Klein, John Andrew Raine, Tobias Golling

Normalizing flows are constructed from a base distribution with a known density and a diffeomorphism with a tractable Jacobian.
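In formulas (with generic notation, not necessarily the paper's): a flow with diffeomorphism $f$ and base density $p_Z$ assigns $\log p_X(x) = \log p_Z(f(x)) + \log\left|\det J_f(x)\right|$. In the flows-for-flows construction the base density is itself a pre-trained, frozen flow $g$ with Gaussian base $p_W$, so the trainable flow $f$ is fitted by maximising $\log p_X(x) = \log p_W\big(g(f(x))\big) + \log\left|\det J_g(f(x))\right| + \log\left|\det J_f(x)\right|$ over the source dataset, which drives $f(x)$ towards the distribution that $g$ was trained on.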

Decorrelation with conditional normalizing flows

1 code implementation · 4 Nov 2022 · Samuel Klein, Tobias Golling

The sensitivity of many physics analyses can be enhanced by constructing discriminants that preferentially select signal events.

Flowification: Everything is a Normalizing Flow

1 code implementation · 30 May 2022 · Bálint Máté, Samuel Klein, Tobias Golling, François Fleuret

Neural networks, on the other hand, only perform a forward pass on the input: there is neither a notion of the inverse of a neural network nor of its likelihood contribution.

Density Estimation

Turbo-Sim: a generalised generative model with a physical latent space

no code implementations · 20 Dec 2021 · Guillaume Quétant, Mariia Drozdova, Vitaliy Kinakh, Tobias Golling, Slava Voloshynovskiy

We present Turbo-Sim, a generalised autoencoder framework derived from principles of information theory that can be used as a generative model.

Information-theoretic stochastic contrastive conditional GAN: InfoSCC-GAN

1 code implementation · 17 Dec 2021 · Vitaliy Kinakh, Mariia Drozdova, Guillaume Quétant, Tobias Golling, Slava Voloshynovskiy

The InfoSCC-GAN architecture is based on an unsupervised contrastive encoder built on the InfoNCE paradigm, an attribute classifier and an EigenGAN generator.

Attribute · Generative Adversarial Network +1
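The contrastive part rests on the InfoNCE objective; a minimal sketch with a toy MLP encoder and Gaussian noise standing in for the image augmentations actually used:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

encoder = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 32))

def info_nce(x, temperature=0.1):
    """Two 'views' of each sample must match each other and not the rest of the batch."""
    view1 = x + 0.1 * torch.randn_like(x)
    view2 = x + 0.1 * torch.randn_like(x)
    z1 = F.normalize(encoder(view1), dim=-1)
    z2 = F.normalize(encoder(view2), dim=-1)
    logits = z1 @ z2.T / temperature                # cosine similarities
    labels = torch.arange(x.shape[0])               # positives on the diagonal
    return F.cross_entropy(logits, labels)

loss = info_nce(torch.randn(256, 64))
loss.backward()
```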

Funnels: Exact maximum likelihood with dimensionality reduction

1 code implementation · 15 Dec 2021 · Samuel Klein, John A. Raine, Sebastian Pina-Otey, Slava Voloshynovskiy, Tobias Golling

Normalizing flows are diffeomorphic, typically dimension-preserving models trained by maximizing their likelihood.

Dimensionality Reduction

Hashing and metric learning for charged particle tracking

no code implementations · 16 Jan 2021 · Sabrina Amrouche, Moritz Kiehn, Tobias Golling, Andreas Salzburger

We propose a novel approach to charged particle tracking at high intensity particle colliders based on Approximate Nearest Neighbors search.

Metric Learning
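The retrieval step can be imitated with an off-the-shelf exact nearest-neighbour index; the paper's approach additionally learns the metric and uses approximate (hashed) buckets for speed, which this sketch omits.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
hits = rng.uniform(-1, 1, size=(100_000, 3))        # toy detector hits (x, y, z)

# Build the index once; queries are then cheap compared with combinatorial seeding.
index = NearestNeighbors(n_neighbors=20).fit(hits)

seed = hits[:5]                                      # hits belonging to a track seed
dist, idx = index.kneighbors(seed)                   # candidate hits near each seed hit
candidates = np.unique(idx)                          # shortlist used to extend the track
print(candidates.shape)
```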

Variational Autoencoders for Anomalous Jet Tagging

1 code implementation · 3 Jul 2020 · Taoli Cheng, Jean-François Arguin, Julien Leissner-Martin, Jacinthe Pilette, Tobias Golling

To build a performant mass-decorrelated anomalous jet tagger, we propose the Outlier Exposed VAE (OE-VAE), for which some outlier samples are introduced in the training process to guide the learned information.

Jet Tagging · Outlier Detection
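A hedged sketch of adding an outlier-exposure term to a standard VAE trained on background jets: exposed outliers are pushed to reconstruct poorly, so that reconstruction error stays a useful anomaly score. The features, architecture, and exact OE-VAE objective in the paper differ, and the margin value below is hypothetical.

```python
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, dim=20, latent=4):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, 2 * latent))
        self.dec = nn.Sequential(nn.Linear(latent, 64), nn.ReLU(), nn.Linear(64, dim))

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + (0.5 * logvar).exp() * torch.randn_like(mu)     # reparameterisation
        recon = ((self.dec(z) - x) ** 2).sum(-1)                 # per-jet reconstruction error
        kl = 0.5 * (mu ** 2 + logvar.exp() - logvar - 1).sum(-1)
        return recon, kl

vae = VAE()
bkg = torch.randn(256, 20)                # toy background-jet features
outliers = torch.randn(64, 20) + 2.0      # toy outlier-exposure sample

recon_bkg, kl_bkg = vae(bkg)
recon_out, _ = vae(outliers)
margin = 10.0                                            # hypothetical margin value
elbo_loss = (recon_bkg + kl_bkg).mean()
oe_loss = torch.relu(margin - recon_out).mean()          # penalise outliers that reconstruct too well
loss = elbo_loss + oe_loss
loss.backward()
```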

Machine Learning in High Energy Physics Community White Paper

no code implementations · 8 Jul 2018 · Kim Albertsson, Piero Altoe, Dustin Anderson, John Anderson, Michael Andrews, Juan Pedro Araque Espinosa, Adam Aurisano, Laurent Basara, Adrian Bevan, Wahid Bhimji, Daniele Bonacorsi, Bjorn Burkle, Paolo Calafiura, Mario Campanelli, Louis Capps, Federico Carminati, Stefano Carrazza, Yi-fan Chen, Taylor Childers, Yann Coadou, Elias Coniavitis, Kyle Cranmer, Claire David, Douglas Davis, Andrea De Simone, Javier Duarte, Martin Erdmann, Jonas Eschle, Amir Farbin, Matthew Feickert, Nuno Filipe Castro, Conor Fitzpatrick, Michele Floris, Alessandra Forti, Jordi Garra-Tico, Jochen Gemmler, Maria Girone, Paul Glaysher, Sergei Gleyzer, Vladimir Gligorov, Tobias Golling, Jonas Graw, Lindsey Gray, Dick Greenwood, Thomas Hacker, John Harvey, Benedikt Hegner, Lukas Heinrich, Ulrich Heintz, Ben Hooberman, Johannes Junggeburth, Michael Kagan, Meghan Kane, Konstantin Kanishchev, Przemysław Karpiński, Zahari Kassabov, Gaurav Kaul, Dorian Kcira, Thomas Keck, Alexei Klimentov, Jim Kowalkowski, Luke Kreczko, Alexander Kurepin, Rob Kutschke, Valentin Kuznetsov, Nicolas Köhler, Igor Lakomov, Kevin Lannon, Mario Lassnig, Antonio Limosani, Gilles Louppe, Aashrita Mangu, Pere Mato, Narain Meenakshi, Helge Meinhard, Dario Menasce, Lorenzo Moneta, Seth Moortgat, Mark Neubauer, Harvey Newman, Sydney Otten, Hans Pabst, Michela Paganini, Manfred Paulini, Gabriel Perdue, Uzziel Perez, Attilio Picazio, Jim Pivarski, Harrison Prosper, Fernanda Psihas, Alexander Radovic, Ryan Reece, Aurelius Rinkevicius, Eduardo Rodrigues, Jamal Rorie, David Rousseau, Aaron Sauers, Steven Schramm, Ariel Schwartzman, Horst Severini, Paul Seyfert, Filip Siroky, Konstantin Skazytkin, Mike Sokoloff, Graeme Stewart, Bob Stienen, Ian Stockdale, Giles Strong, Wei Sun, Savannah Thais, Karen Tomko, Eli Upfal, Emanuele Usai, Andrey Ustyuzhanin, Martin Vala, Justin Vasel, Sofia Vallecorsa, Mauro Verzetti, Xavier Vilasís-Cardona, Jean-Roch Vlimant, Ilija Vukotic, Sean-Jiun Wang, Gordon Watts, Michael Williams, Wenjing Wu, Stefan Wunsch, Kun Yang, Omar Zapata

In this document we discuss promising future research and development areas for machine learning in particle physics.

BIG-bench Machine Learning · Vocal Bursts Intensity Prediction
