no code implementations • ICLR 2019 • Michael C. Mozer, Denis Kazakov, Robert V. Lindsey
Attractor dynamics are incorporated into the hidden state to "clean up" representations at each step of a sequence.
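The snippet above describes cleaning up a noisy hidden state with attractor dynamics. A minimal sketch of that idea, using a Hopfield-style attractor with a single stored pattern — the Hebbian weight matrix, gain, and step count are illustrative assumptions, not the paper's learned architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def attractor_cleanup(h, W, steps=20, alpha=0.5):
    """Iterate attractor dynamics to pull a noisy state toward a
    stored fixed point, 'cleaning up' the representation."""
    for _ in range(steps):
        h = (1 - alpha) * h + alpha * np.tanh(W @ h)
    return h

# Store one pattern via a Hebbian outer product; the gain of 2 gives
# the dynamics a nontrivial fixed point near the pattern (again an
# illustrative assumption, not the paper's trained attractor weights).
pattern = np.sign(rng.standard_normal(16))
W = 2.0 * np.outer(pattern, pattern) / 16.0

noisy = pattern + 0.4 * rng.standard_normal(16)
cleaned = attractor_cleanup(noisy, W)
# `cleaned` matches `pattern` in sign and is closer to it than `noisy`.
```

In the paper the attractor network is trained jointly with the recurrent net; here the single-pattern Hebbian matrix merely demonstrates the cleanup effect.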
1 code implementation • 11 Oct 2017 • Michael C. Mozer, Denis Kazakov, Robert V. Lindsey
The CT-GRU arises from interpreting the gates of a GRU as selecting a time scale of memory; it generalizes the GRU by incorporating multiple time scales of memory and performing context-dependent selection of time scales for information storage and retrieval.
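A toy sketch of the multi-timescale idea: memory traces decay at scale-specific rates during the interval between events, and weights choose which scales store and retrieve information. In the actual CT-GRU these weights are produced by learned gates; the time constants, hand-set weights, and function name below are illustrative assumptions:

```python
import numpy as np

# Assumed decay time constants, one per time scale.
TAUS = np.array([1.0, 10.0, 100.0])

def ctgru_step(traces, x, dt, store_w, retrieve_w):
    """One event step of a multi-timescale memory cell.

    traces:     (n_scales, d) memory traces, one per time scale
    x:          (d,) content to store at this event
    dt:         time elapsed since the previous event
    store_w:    (n_scales,) distribution of storage over scales
    retrieve_w: (n_scales,) combination of scales at retrieval
    """
    decay = np.exp(-dt / TAUS)[:, None]   # fast scales forget quickly
    traces = decay * traces + store_w[:, None] * x
    readout = retrieve_w @ traces         # context-dependent retrieval
    return traces, readout

traces = np.zeros((3, 4))
store_w = np.array([0.2, 0.5, 0.3])
retrieve_w = np.array([0.1, 0.6, 0.3])
traces, readout = ctgru_step(traces, np.ones(4), dt=5.0,
                             store_w=store_w, retrieve_w=retrieve_w)
# Storing into empty traces gives readout = (retrieve_w @ store_w) * x.
```

Because decay depends on the elapsed time `dt`, the cell handles irregularly timed events natively, which is the point of the continuous-time formulation.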
no code implementations • 14 Mar 2016 • Mohammad Khajah, Robert V. Lindsey, Michael C. Mozer
In theoretical cognitive science, there is a tension between highly structured models whose parameters have a direct psychological interpretation and highly complex, general-purpose models whose parameters and representations are difficult to interpret.
no code implementations • NeurIPS 2014 • Robert V. Lindsey, Mohammad Khajah, Michael C. Mozer
First, in three of the five datasets, the skills inferred by our technique support significantly improved predictions of student performance over the expert-provided skills.
no code implementations • NeurIPS 2013 • Robert V. Lindsey, Michael C. Mozer, William J. Huggins, Harold Pashler
For example, in the domain of concept learning, a policy might specify the nature of exemplars chosen over a training sequence.
no code implementations • NeurIPS 2010 • Michael C. Mozer, Harold Pashler, Matthew Wilder, Robert V. Lindsey, Matt Jones, Michael N. Jones
Our decontamination techniques yield a reduction of more than 20% in the error of human judgments.
no code implementations • NeurIPS 2009 • Harold Pashler, Nicholas Cepeda, Robert V. Lindsey, Ed Vul, Michael C. Mozer
MCM is intriguingly similar to a Bayesian multiscale model of memory (Kording, Tenenbaum, & Shadmehr, 2007), yet it better accounts for human declarative memory.