no code implementations • 14 Sep 2022 • Charles B. Delahunt, Noni Gachuhi, Matthew P. Horning
Two factors in particular are crucial to developing algorithms translatable to clinical field settings: (i) Clear understanding of the clinical needs that ML solutions must accommodate; and (ii) task-relevant metrics for guiding and evaluating ML models.
1 code implementation • 12 Nov 2021 • Alan A. Kaptanoglu, Brian M. de Silva, Urban Fasel, Kadierdan Kaheman, Andy J. Goldschmidt, Jared L. Callaham, Charles B. Delahunt, Zachary G. Nicolaou, Kathleen Champion, Jean-Christophe Loiseau, J. Nathan Kutz, Steven L. Brunton
Automated data-driven modeling, the process of directly discovering the governing equations of a system from data, is increasingly being used across the scientific community.
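The core sparse-regression idea behind this line of work (SINDy-style discovery) can be sketched in a few lines: approximate derivatives from data, regress them onto a library of candidate terms, and threshold away small coefficients. The toy system, library, and threshold below are illustrative assumptions, not the PySINDy API.

```python
import math

def lstsq(A, b):
    # Solve the normal equations (A^T A) c = A^T b by Gaussian elimination.
    n = len(A[0])
    M = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(n)]
         for i in range(n)]
    v = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            f = M[j][i] / M[i][i]
            for k in range(i, n):
                M[j][k] -= f * M[i][k]
            v[j] -= f * v[i]
    c = [0.0] * n
    for i in range(n - 1, -1, -1):
        c[i] = (v[i] - sum(M[i][k] * c[k] for k in range(i + 1, n))) / M[i][i]
    return c

# Samples of x(t) = exp(-2t), whose true dynamics are x' = -2x.
dt = 0.01
x = [math.exp(-2 * k * dt) for k in range(200)]
# Central finite differences approximate x'.
xdot = [(x[k + 1] - x[k - 1]) / (2 * dt) for k in range(1, len(x) - 1)]
# Candidate library: [1, x, x^2].
Theta = [[1.0, x[k], x[k] ** 2] for k in range(1, len(x) - 1)]

coeffs = lstsq(Theta, xdot)
# One thresholding pass: drop near-zero terms, refit on the survivors.
keep = [i for i, c in enumerate(coeffs) if abs(c) > 0.1]
Theta_k = [[row[i] for i in keep] for row in Theta]
c_k = lstsq(Theta_k, xdot)
coeffs = [0.0] * 3
for i, c in zip(keep, c_k):
    coeffs[i] = c
print(coeffs)  # close to [0, -2, 0]: recovers x' = -2x
```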
1 code implementation • 8 Nov 2021 • Charles B. Delahunt, J. Nathan Kutz
Second, we propose a technique, applicable to any model discovery method based on x' = f(x), to assess the accuracy of a discovered model in the context of non-unique solutions due to noisy data.
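One way to illustrate why coefficient-level comparison fails under noise, and why a trajectory-level check helps, is to integrate two candidate models of x' = f(x) forward and compare them against the data. This is only a sketch of the general idea (the paper's specific accuracy metric is not reproduced here); the models and tolerances are made up for illustration.

```python
import math

def simulate(f, x0, dt, n):
    # Forward-Euler integration of x' = f(x).
    xs = [x0]
    for _ in range(n):
        xs.append(xs[-1] + dt * f(xs[-1]))
    return xs

# "Data": samples of the true system x' = -2x.
dt, n = 0.01, 300
data = [math.exp(-2 * k * dt) for k in range(n + 1)]

# Two hypothetical discovered models that fit noisy derivatives
# similarly well near x = 1, but are genuinely different functions.
model_a = lambda x: -2.02 * x               # close to the truth
model_b = lambda x: -1.0 * x - 1.0 * x**2   # agrees with -2x only at x = 1

def trajectory_error(f):
    sim = simulate(f, data[0], dt, n)
    return max(abs(s - d) for s, d in zip(sim, data))

err_a = trajectory_error(model_a)
err_b = trajectory_error(model_b)
print(err_a, err_b)
```

Simulated trajectories separate the two models cleanly even though both match the data's derivative at the initial condition.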
1 code implementation • NeurIPS Workshop Neuro_AI 2019 • Charles B. Delahunt, J. Nathan Kutz
In this work we deploy MothNet, a computational model of the moth olfactory network, as an automatic feature generator.
no code implementations • 5 Aug 2019 • Charles B. Delahunt, Mayoore S. Jaiswal, Matthew P. Horning, Samantha Janko, Clay M. Thompson, Sourabh Kulhare, Liming Hu, Travis Ostbye, Grace Yun, Roman Gebrehiwot, Benjamin K. Wilson, Earl Long, Stephane Proux, Dionicia Gamboa, Peter Chiodini, Jane Carter, Mehul Dhorda, David Isaboke, Bernhards Ogutu, Wellington Oyibo, Elizabeth Villasis, Kyaw Myo Tun, Christine Bachman, David Bell, Courosh Mehanian
Malaria is a life-threatening disease affecting millions.
no code implementations • 26 Jan 2019 • Charles B. Delahunt, Courosh Mehanian, J. Nathan Kutz
To explore this potential resource, we develop a hybrid classifier (Softmax-Pooling Hybrid, SPH) that uses Softmax on high-scoring samples, but on low-scoring samples uses a log-likelihood method that pools the information from the full array D.
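The two-branch decision rule described here can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the confidence threshold, the interpretation of the array D as per-unit class-score vectors, and the pooling-by-summed-log-likelihood step are all assumptions.

```python
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def sph_predict(unit_scores, threshold=0.9):
    """Softmax-Pooling Hybrid, illustrative sketch.

    unit_scores: per-unit, per-class scores from a hypothetical array D
    of readout units, shape [n_units][n_classes].
    """
    n_classes = len(unit_scores[0])
    # Softmax branch: mean score across units as a single model output.
    mean_scores = [sum(u[c] for u in unit_scores) / len(unit_scores)
                   for c in range(n_classes)]
    probs = softmax(mean_scores)
    if max(probs) >= threshold:
        # High-scoring sample: trust the Softmax decision.
        return max(range(n_classes), key=lambda c: probs[c])
    # Low-scoring sample: pool log-likelihoods over the full array of units.
    pooled = [sum(math.log(softmax(u)[c]) for u in unit_scores)
              for c in range(n_classes)]
    return max(range(n_classes), key=lambda c: pooled[c])

# A confident sample takes the Softmax branch; an ambiguous one is pooled.
print(sph_predict([[5.0, 0.0], [4.0, 0.0]]))            # Softmax branch
print(sph_predict([[0.0, 0.2], [0.0, 0.2], [0.2, 0.0]]))  # pooling branch
```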
1 code implementation • 23 Aug 2018 • Charles B. Delahunt, J. Nathan Kutz
In this work, we deployed MothNet, a computational model of the insect olfactory network, as an automatic feature generator: Attached as a front-end pre-processor, its Readout Neurons provided new features, derived from the original features, for use by standard ML classifiers.
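The front-end wiring pattern described here (a pre-processor whose readout units emit new features, concatenated with the originals for a downstream classifier) can be sketched as below. The real MothNet derives its readout-neuron features from a trained model of the moth olfactory network; the fixed sparse random projection with a threshold nonlinearity used here is a hypothetical stand-in that shows only the plumbing.

```python
import random

def make_front_end(n_in, n_out, sparsity=0.3, seed=0):
    """Hypothetical stand-in for a MothNet-style feature generator."""
    rng = random.Random(seed)
    # Sparse random projection weights (assumption, not MothNet's readout).
    weights = [[rng.gauss(0, 1) if rng.random() < sparsity else 0.0
                for _ in range(n_in)] for _ in range(n_out)]

    def front_end(features):
        # Derived features: rectified weighted sums of the original features.
        derived = [max(0.0, sum(w * f for w, f in zip(row, features)))
                   for row in weights]
        # Downstream ML classifiers see original + derived features.
        return list(features) + derived

    return front_end

fe = make_front_end(n_in=4, n_out=3)
augmented = fe([0.5, 1.0, 0.0, 2.0])
print(len(augmented))  # 4 original + 3 derived = 7
```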
1 code implementation • 15 Feb 2018 • Charles B. Delahunt, J. Nathan Kutz
The Moth Olfactory Network is among the simplest biological neural systems that can learn, and its architecture includes key structural elements and mechanisms widespread in biological neural nets, such as cascaded networks, competitive inhibition, high intrinsic noise, sparsity, reward mechanisms, and Hebbian plasticity.
1 code implementation • 8 Feb 2018 • Charles B. Delahunt, Jeffrey A. Riffell, J. Nathan Kutz
From a biological perspective, the model provides a valuable tool for examining the role of neuromodulators, like octopamine, in learning, and gives insight into critical interactions between sparsity, Hebbian growth, and stimulation during learning.