1 code implementation • 8 Feb 2024 • Viktor Nilsson, Anirban Samaddar, Sandeep Madireddy, Pierre Nyquist
The approach combines the minimization of the cross-entropy for simple, adaptive base models and the estimation of their deviation, in terms of the relative entropy, from the data density.
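This pairing rests on the standard identity D_KL(p || q) = H(p, q) − H(p): the relative entropy of the base model q from the data density p is the cross-entropy minus the entropy of p. A minimal Monte Carlo sketch with hypothetical one-dimensional Gaussian densities (illustrative only, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Data density p = N(0, 1) and a simple base model q = N(mu, sigma^2).
mu, sigma = 0.5, 1.2
x = rng.normal(0.0, 1.0, size=200_000)  # samples drawn from p

def log_normal_pdf(x, mean, std):
    return -0.5 * np.log(2 * np.pi * std**2) - (x - mean) ** 2 / (2 * std**2)

# Monte Carlo estimates: H(p, q) = -E_p[log q(x)] and H(p) = -E_p[log p(x)].
cross_entropy = -log_normal_pdf(x, mu, sigma).mean()
entropy = -log_normal_pdf(x, 0.0, 1.0).mean()
kl_estimate = cross_entropy - entropy  # estimate of D_KL(p || q)

# Closed-form KL divergence between the two Gaussians, for comparison.
kl_exact = np.log(sigma) + (1.0 + mu**2) / (2 * sigma**2) - 0.5
print(kl_estimate, kl_exact)
```

With enough samples the Monte Carlo estimate tracks the closed form closely; minimizing the cross-entropy term over the base model is what ties the fit to the relative-entropy deviation.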
no code implementations • 6 Dec 2023 • Tung Nguyen, Rohan Shah, Hritik Bansal, Troy Arcomano, Sandeep Madireddy, Romit Maulik, Veerabhadra Kotamarthi, Ian Foster, Aditya Grover
At the core of Stormer is a randomized forecasting objective that trains the model to forecast the weather dynamics over varying time intervals.
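A randomized forecasting objective of this kind can be sketched as follows: each training example draws a lead time from a candidate set and supervises the model on the state that far ahead, so a single model learns all horizons. The array shapes, interval set, and sampling scheme below are illustrative assumptions, not Stormer's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for gridded weather states on a regular time axis.
T, D = 100, 4
states = rng.normal(size=(T, D))
intervals = [1, 2, 4]  # candidate lead times (e.g., in 6-hour steps)

def sample_training_pair():
    """Randomized objective: draw a lead time, then an (input, target) pair."""
    dt = int(rng.choice(intervals))
    t = int(rng.integers(0, T - dt))
    return states[t], states[t + dt], dt

batch = [sample_training_pair() for _ in range(8)]
# Each example carries its own lead time, so the model is conditioned on the
# horizon and longer forecasts can be composed from shorter steps at inference.
```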
no code implementations • 10 Nov 2023 • Yixuan Sun, Elizabeth Cucuzzella, Steven Brus, Sri Hari Krishna Narayanan, Balu Nadiga, Luke Van Roekel, Jan Hückelheim, Sandeep Madireddy
Modeling is crucial to understanding the effect of greenhouse gases, warming, and ice sheet melting on the ocean.
no code implementations • 25 Oct 2023 • Ray A. O. Sinurat, Anurag Daram, Haryadi S. Gunawi, Robert B. Ross, Sandeep Madireddy

Machine learning-based performance models are increasingly being used to inform critical job scheduling and application optimization decisions.
no code implementations • 8 Aug 2023 • Sandeep Madireddy, Angel Yanguas-Gil, Prasanna Balaprakash
The ability to learn continuously from an incoming data stream without catastrophic forgetting is critical to designing intelligent systems.
1 code implementation • 26 Feb 2023 • Angel Yanguas-Gil, Sandeep Madireddy
In this work we have extended AutoML inspired approaches to the exploration and optimization of neuromorphic architectures.
no code implementations • 18 Jan 2023 • Megan M. Baker, Alexander New, Mario Aguilar-Simon, Ziad Al-Halah, Sébastien M. R. Arnold, Ese Ben-Iwhiwhu, Andrew P. Brna, Ethan Brooks, Ryan C. Brown, Zachary Daniels, Anurag Daram, Fabien Delattre, Ryan Dellana, Eric Eaton, Haotian Fu, Kristen Grauman, Jesse Hostetler, Shariq Iqbal, Cassandra Kent, Nicholas Ketz, Soheil Kolouri, George Konidaris, Dhireesha Kudithipudi, Erik Learned-Miller, Seungwon Lee, Michael L. Littman, Sandeep Madireddy, Jorge A. Mendez, Eric Q. Nguyen, Christine D. Piatko, Praveen K. Pilly, Aswin Raghavan, Abrar Rahman, Santhosh Kumar Ramakrishnan, Neale Ratzlaff, Andrea Soltoggio, Peter Stone, Indranil Sur, Zhipeng Tang, Saket Tiwari, Kyle Vedder, Felix Wang, Zifan Xu, Angel Yanguas-Gil, Harel Yedidsion, Shangqun Yu, Gautam K. Vallabha
Despite recent advances in machine learning techniques, state-of-the-art systems lack robustness to "real world" events: the input distributions and tasks encountered by deployed systems will not be limited to the original training context, and systems must instead adapt to novel distributions and tasks while deployed.
1 code implementation • 30 Nov 2022 • Angel Yanguas-Gil, Sandeep Madireddy
Our model leverages the offline training of a feature-extraction stage and a common general policy layer to enable the convergence of RL algorithms in online settings.
1 code implementation • 1 Nov 2022 • Aleksandra Ćiprijanović, Ashia Lewis, Kevin Pedro, Sandeep Madireddy, Brian Nord, Gabriel N. Perdue, Stefan M. Wild
For the first time, we demonstrate the successful use of domain adaptation on two very different observational datasets (from SDSS and DECaLS).
no code implementations • 8 Oct 2022 • Sumegha Premchandar, Sandeep Madireddy, Sanket Jantre, Prasanna Balaprakash
To this end, we propose a Unified probabilistic architecture and weight ensembling Neural Architecture Search (UraeNAS) that leverages advances in probabilistic neural architecture search and approximate Bayesian inference to generate ensembles from the joint distribution of neural network architectures and weights.
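Ensembling from a joint distribution over architectures and weights means each ensemble member samples both a discrete architecture choice and a set of weights for it, then predictions are averaged. The sketch below uses a hidden-layer width as the architecture variable and Gaussian weight draws; all names and distributions are illustrative assumptions, not the UraeNAS implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

widths = [8, 16, 32]
arch_probs = [0.2, 0.5, 0.3]  # assumed learned distribution over architectures

def predict(x, w_in, w_out):
    h = np.tanh(x @ w_in)      # one hidden layer of the sampled width
    return h @ w_out

x = rng.normal(size=(5, 3))
preds = []
for _ in range(20):  # ensemble members drawn from the joint distribution
    width = int(rng.choice(widths, p=arch_probs))   # sample an architecture
    w_in = rng.normal(scale=0.5, size=(3, width))   # sample its weights
    w_out = rng.normal(scale=0.5, size=(width, 1))
    preds.append(predict(x, w_in, w_out))

mean_pred = np.mean(preds, axis=0)  # average over architectures and weights
```

Averaging over both sources of variation is what distinguishes this from a fixed-architecture deep ensemble, which varies only the weights.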
no code implementations • 3 Oct 2022 • Matthieu Dorier, Romain Egele, Prasanna Balaprakash, Jaehoon Koo, Sandeep Madireddy, Srinivasan Ramesh, Allen D. Malony, Rob Ross
Distributed data storage services tailored to specific applications have grown popular in the high-performance computing (HPC) community as a way to address I/O and storage challenges.
no code implementations • 1 Jun 2022 • Sanket Jantre, Sandeep Madireddy, Shrijita Bhattacharya, Tapabrata Maiti, Prasanna Balaprakash
Deep neural network ensembles that appeal to model diversity have been used successfully to improve predictive performance and model robustness in several applications.
no code implementations • 4 Mar 2022 • Anirban Samaddar, Sandeep Madireddy, Prasanna Balaprakash, Tapabrata Maiti, Gustavo de los Campos, Ian Fischer
In addition, it provides a mechanism for learning a joint distribution of the latent variable and the sparsity and hence can account for the complete uncertainty in the latent space.
no code implementations • 28 Dec 2021 • Aleksandra Ćiprijanović, Diana Kafkes, Gregory Snyder, F. Javier Sánchez, Gabriel Nathan Perdue, Kevin Pedro, Brian Nord, Sandeep Madireddy, Stefan M. Wild
On the other hand, we show that training with domain adaptation improves model robustness and mitigates the effects of these perturbations, improving the classification accuracy by 23% on data with higher observational noise.
no code implementations • 25 Oct 2021 • Allison McCarn Deiana, Nhan Tran, Joshua Agar, Michaela Blott, Giuseppe Di Guglielmo, Javier Duarte, Philip Harris, Scott Hauck, Mia Liu, Mark S. Neubauer, Jennifer Ngadiuba, Seda Ogrenci-Memik, Maurizio Pierini, Thea Aarrestad, Steffen Bahr, Jurgen Becker, Anne-Sophie Berthold, Richard J. Bonventre, Tomas E. Muller Bravo, Markus Diefenthaler, Zhen Dong, Nick Fritzsche, Amir Gholami, Ekaterina Govorkova, Kyle J Hazelwood, Christian Herwig, Babar Khan, Sehoon Kim, Thomas Klijnsma, Yaling Liu, Kin Ho Lo, Tri Nguyen, Gianantonio Pezzullo, Seyedramin Rasoulinezhad, Ryan A. Rivera, Kate Scholberg, Justin Selig, Sougata Sen, Dmitri Strukov, William Tang, Savannah Thais, Kai Lukas Unger, Ricardo Vilalta, Belina von Krosigk, Thomas K. Warburton, Maria Acosta Flechas, Anthony Aportela, Thomas Calvet, Leonardo Cristella, Daniel Diaz, Caterina Doglioni, Maria Domenica Galati, Elham E Khoda, Farah Fahim, Davide Giri, Benjamin Hawks, Duc Hoang, Burt Holzman, Shih-Chieh Hsu, Sergo Jindariani, Iris Johnson, Raghav Kansal, Ryan Kastner, Erik Katsavounidis, Jeffrey Krupa, Pan Li, Sandeep Madireddy, Ethan Marx, Patrick McCormack, Andres Meza, Jovan Mitrevski, Mohammed Attia Mohammed, Farouk Mokhtar, Eric Moreno, Srishti Nagu, Rohin Narayan, Noah Palladino, Zhiqiang Que, Sang Eon Park, Subramanian Ramamoorthy, Dylan Rankin, Simon Rothman, Ashish Sharma, Sioni Summers, Pietro Vischia, Jean-Roch Vlimant, Olivia Weng
In this community review report, we discuss applications and techniques for fast machine learning (ML) in science -- the concept of integrating powerful ML methods into the real-time experimental data processing loop to accelerate scientific discovery.
no code implementations • 16 Jul 2020 • Sandeep Madireddy, Angel Yanguas-Gil, Prasanna Balaprakash
Using high-performing configurations meta-learned in the single-task learning setting, we achieve superior continual learning performance on Split-MNIST and Split-CIFAR-10 compared with other memory-constrained learning approaches, and match that of the state-of-the-art memory-intensive replay-based approaches.
no code implementations • ICML Workshop LifelongML 2020 • Sandeep Madireddy, Angel Yanguas-Gil, Prasanna Balaprakash
We focus on the problem of how to achieve online continual learning under memory-constrained conditions where the input data may not be known a priori.
no code implementations • 10 Nov 2019 • Peihong Jiang, Hieu Doan, Sandeep Madireddy, Rajeev Surendran Assary, Prasanna Balaprakash
Computer-assisted synthesis planning aims to help chemists find better reaction pathways faster.
no code implementations • 10 Nov 2019 • Sandeep Madireddy, Nesar Ramachandra, Nan Li, James Butler, Prasanna Balaprakash, Salman Habib, Katrin Heitmann, The LSST Dark Energy Science Collaboration
Upcoming large astronomical surveys are expected to capture an unprecedented number of strong gravitational lensing systems.
no code implementations • 18 Sep 2019 • Romit Maulik, Vishwas Rao, Sandeep Madireddy, Bethany Lusch, Prasanna Balaprakash
Rapid simulations of advection-dominated problems are vital for multiple engineering and geophysical applications.
no code implementations • 4 Jun 2019 • Sandeep Madireddy, Angel Yanguas-Gil, Prasanna Balaprakash
Our results show that optimal learning rules can be dataset-dependent even within similar tasks.