no code implementations • 3 Jan 2024 • Jason Moore, Alexander Genkin, Magnus Tournoy, Joshua Pughe-Sanford, Rob R. de Ruyter van Steveninck, Dmitri B. Chklovskii
In the quest to model neuronal function amidst gaps in physiological data, a promising strategy is to develop a normative theory that interprets neuronal physiology as optimizing a computational objective.
2 code implementations • NeurIPS 2021 • Johannes Friedrich, Siavash Golkar, Shiva Farashahi, Alexander Genkin, Anirvan M. Sengupta, Dmitri B. Chklovskii
This network performs system identification and Kalman filtering without the need for multiple phases with distinct update rules or knowledge of the noise covariances.
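For contrast, the classical Kalman filter requires exactly the ingredients the network is said to learn online: the system matrices and the noise covariances. A minimal NumPy sketch of that classical baseline (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def kalman_filter(A, C, Q, R, observations, x0, P0):
    """Classical Kalman filter. It needs the dynamics A, observation
    matrix C, and noise covariances Q, R up front -- the quantities
    the biologically plausible network is said to identify online."""
    x, P = x0, P0
    estimates = []
    for y in observations:
        # Predict step: propagate state and covariance through the dynamics.
        x = A @ x
        P = A @ P @ A.T + Q
        # Update step: correct with the innovation y - C x.
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ (y - C @ x)
        P = (np.eye(len(x)) - K @ C) @ P
        estimates.append(x.copy())
    return np.array(estimates)
```

Running this on a constant hidden state shows the estimate converging toward the observations, which is the behavior the normative network is meant to reproduce without being given Q and R.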
no code implementations • 21 Aug 2019 • Alexander Genkin, Anirvan M. Sengupta, Dmitri Chklovskii
Here, we propose a feed-forward neural network capable of semi-supervised learning on manifolds without using an explicit graph representation.
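The standard approach this work departs from builds an explicit neighborhood graph over the data and propagates labels along it. A hedged scikit-learn sketch of that graph-based baseline (dataset and parameters are illustrative, not from the paper):

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.semi_supervised import LabelSpreading

# Classical graph-based semi-supervised learning: LabelSpreading
# constructs an explicit k-NN graph over all points -- the explicit
# graph representation the proposed feed-forward network avoids.
X, y = make_moons(n_samples=200, noise=0.05, random_state=0)
labels = np.full(len(y), -1)   # -1 marks unlabeled points
labels[::20] = y[::20]         # keep only 1 label in 20
model = LabelSpreading(kernel="knn", n_neighbors=7)
model.fit(X, labels)
predicted = model.transduction_  # labels propagated along the graph
```

With only ten labeled points, label spreading recovers the two-moons structure because the k-NN graph follows the data manifold; the feed-forward alternative aims for the same effect without materializing that graph.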
1 code implementation • NeurIPS 2018 • Anirvan Sengupta, Cengiz Pehlevan, Mariano Tepper, Alexander Genkin, Dmitri Chklovskii
Many neurons in the brain, such as place cells in the rodent hippocampus, have localized receptive fields, i.e., they respond to a small neighborhood of stimulus space.
1 code implementation • 7 Nov 2016 • Ilya Trofimov, Alexander Genkin
The generalized linear model with $L_1$ and $L_2$ regularization is a widely used technique for solving classification, class probability estimation, and regression problems.
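As a concrete instance of this model class, logistic regression (a GLM for classification) with combined $L_1$/$L_2$ (elastic-net) regularization can be fit in scikit-learn; this is a generic sketch of the model family, not the paper's solver:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic binary classification problem (illustrative data).
X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=5, random_state=0)

# Elastic-net penalty: l1_ratio interpolates between pure L2 (0.0)
# and pure L1 (1.0); the saga solver supports this combination.
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=1.0, max_iter=5000)
clf.fit(X, y)

# The L1 component tends to drive some coefficients exactly to zero,
# giving feature selection alongside the L2 shrinkage.
sparsity = (clf.coef_ == 0).mean()
```

The $L_1$ term yields sparse coefficients while the $L_2$ term stabilizes the solution under correlated features, which is why the combination is common across classification and regression settings.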
1 code implementation • 24 Nov 2014 • Ilya Trofimov, Alexander Genkin
Solving $L_1$-regularized logistic regression in distributed settings is an important problem.
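To make the underlying objective concrete, here is a single-machine proximal-gradient (ISTA) sketch of $L_1$-regularized logistic regression; the paper's contribution concerns solving this objective in a distributed setting, and this sketch is only the baseline problem, not the paper's algorithm:

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of the L1 norm: shrinks w toward zero by t."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def l1_logreg_ista(X, y, lam, lr=0.1, iters=500):
    """Proximal gradient descent for
    min_w (1/n) * sum log(1 + exp(-margins)) + lam * ||w||_1,
    written here with labels y in {0, 1}."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted probabilities
        grad = X.T @ (p - y) / n           # gradient of the logistic loss
        # Gradient step on the smooth loss, then L1 proximal step.
        w = soft_threshold(w - lr * grad, lr * lam)
    return w
```

With a large enough penalty `lam`, the soft-thresholding step keeps all coefficients at exactly zero, which illustrates the sparsity that makes the $L_1$ formulation attractive at scale.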