Search Results for author: Tankut Can

Found 6 papers, 2 papers with code

Using large language models to study human memory for meaningful narratives

1 code implementation · 8 Nov 2023 · Antonios Georgiou, Tankut Can, Mikhail Katkov, Misha Tsodyks

One of the most impressive achievements of the AI revolution is the development of large language models that can generate meaningful text and respond to instructions in plain English with no additional training necessary.

Trainability, Expressivity and Interpretability in Gated Neural ODEs

1 code implementation · 12 Jul 2023 · Timothy Doyeon Kim, Tankut Can, Kamesh Krishnamurthy

Using this measure, we explore how the phase-space dimension of the nODEs and the complexity of the function modeling the flow field contribute to expressivity.
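
The gated-flow-field idea can be sketched with a minimal numpy example. This is illustrative only: the network size, weight scaling, Euler integrator, and the single multiplicative gate below are assumptions, not the paper's exact nODE architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10                                         # assumed phase-space dimension
W = rng.normal(0, 1.0 / np.sqrt(n), (n, n))    # recurrent weights of the flow field
Wz = rng.normal(0, 1.0 / np.sqrt(n), (n, n))   # weights feeding the gate

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def flow(x):
    """Gated flow field: the gate z multiplicatively modulates the dynamics,
    which effectively rescales the local timescale of each unit."""
    z = sigmoid(Wz @ x)
    return z * (-x + np.tanh(W @ x))

# Forward-Euler integration of dx/dt = flow(x)
x = rng.normal(size=n)
dt = 0.01
for _ in range(1000):
    x = x + dt * flow(x)
```

Because the gate output lies in (0, 1) and the tanh term is bounded, the trajectory stays bounded; richer flow fields (deeper networks for `flow`) trade expressivity against trainability.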


Flatter, faster: scaling momentum for optimal speedup of SGD

no code implementations · 28 Oct 2022 · Aditya Cowsik, Tankut Can, Paolo Glorioso

Commonly used optimization algorithms often show a trade-off between good generalization and fast training times.
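
Heavy-ball SGD with momentum, the setting this paper analyzes, can be sketched as follows. The quadratic toy loss and the specific hyperparameter values are assumptions for illustration; the paper's actual result concerns how the momentum parameter should scale with the learning rate, which is not asserted here.

```python
import numpy as np

def sgd_momentum(grad, x0, lr=0.005, beta=0.9, steps=200):
    """Heavy-ball momentum: v <- beta*v - lr*grad(x); x <- x + v.
    beta controls how much past gradients persist in the update."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v - lr * grad(x)
        x = x + v
    return x

# Toy ill-conditioned quadratic loss f(x) = 0.5 * x^T A x
A = np.diag([1.0, 100.0])
grad = lambda x: A @ x
x_final = sgd_momentum(grad, x0=[1.0, 1.0])
```

On this quadratic, momentum damps oscillations along the stiff direction while accelerating the flat one, which is the generalization-vs-speed tension the snippet above refers to.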

Emergence of memory manifolds

no code implementations · 8 Sep 2021 · Tankut Can, Kamesh Krishnamurthy

The ability to store continuous variables in the state of a biological system (e.g. a neural network) is critical for many behaviours.

Theory of gating in recurrent neural networks

no code implementations · 29 Jul 2020 · Kamesh Krishnamurthy, Tankut Can, David J. Schwab

The gate modulating the dimensionality can induce a novel, discontinuous chaotic transition, where inputs push a stable system to strong chaotic activity, in contrast to the typically stabilizing effect of inputs.

Gating creates slow modes and controls phase-space complexity in GRUs and LSTMs

no code implementations · 31 Jan 2020 · Tankut Can, Kamesh Krishnamurthy, David J. Schwab

Here, we take the perspective of studying randomly initialized LSTMs and GRUs as dynamical systems, and ask how the salient dynamical properties are shaped by the gates.
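
The "randomly initialized GRU as a dynamical system" perspective can be sketched by iterating the standard GRU update map from a random state. The gain `g` and network size are assumptions; varying `g` is one way to probe how the gates shape the dynamics, not the paper's exact protocol.

```python
import numpy as np

rng = np.random.default_rng(1)
n, g = 50, 2.0                         # assumed network size and weight gain

def rand_w():
    return rng.normal(0, g / np.sqrt(n), (n, n))

Wz, Wr, Wh = rand_w(), rand_w(), rand_w()   # random Gaussian weights, no training

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def gru_step(h):
    """Standard (input-free) GRU update map."""
    z = sigmoid(Wz @ h)                # update gate: sets effective timescales
    r = sigmoid(Wr @ h)                # reset gate: modulates the recurrent input
    return (1 - z) * h + z * np.tanh(Wh @ (r * h))

# Iterate the map and inspect the autonomous dynamics
h = rng.normal(size=n)
for _ in range(500):
    h = gru_step(h)
```

Because the update is a convex combination of the old state and a bounded tanh term, trajectories stay bounded; whether they decay, settle on slow modes, or turn chaotic depends on the gain and the gate statistics.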
