1 code implementation • 28 Apr 2023 • Sean Deyo, Veit Elser
We introduce the logical grammar embedding (LGE), a model inspired by pregroup grammars and categorial grammars, to enable unsupervised inference of lexical categories and syntactic rules from a corpus of text.
1 code implementation • 27 Apr 2023 • Sean Deyo, Veit Elser
We use a binary attribute representation (BAR) model to describe a data set of Netflix viewers' ratings of movies.
no code implementations • 18 Jan 2022 • Sean Deyo, Veit Elser
We implement a divide-and-concur iterative projection approach to context-free grammar inference.
no code implementations • 9 Jun 2021 • Sean Deyo, Veit Elser
Iterative projection methods may become trapped at non-solutions when the constraint sets are nonconvex.
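As a toy illustration of this trapping (the sets and numbers here are invented, not taken from the paper), plain alternating projections between the nonconvex two-point set A = {0, 3} and the interval B = [1, 3] can cycle forever between 0 and 1 even though 3 lies in both sets, while a relaxed reflect-reflect (RRR) style iteration of the kind studied in this line of work escapes the trap:

```python
def proj_A(x):
    # nearest point of the nonconvex two-point set A = {0, 3}
    return 0.0 if abs(x - 0.0) <= abs(x - 3.0) else 3.0

def proj_B(x):
    # nearest point of the interval B = [1, 3]
    return min(max(x, 1.0), 3.0)

# Plain alternating projections: trapped in a 0 <-> 1 cycle.
x = 0.4
for _ in range(50):
    x = proj_B(proj_A(x))
# x is now 1.0, which is not in A: a fixed cycle at a non-solution.

# RRR / difference-map style iteration (beta = 1) escapes the trap:
#   x <- x + P_A(2 P_B(x) - x) - P_B(x)
y = 0.4
for _ in range(50):
    pb = proj_B(y)
    y = y + proj_A(2 * pb - y) - pb
solution = proj_B(y)  # lands at 3, which lies in both A and B
```

The fixed points of the RRR map encode solutions through P_B(x) rather than x itself, which is what lets the iterate pass through the gap between the sets instead of stalling at a local distance minimum.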
1 code implementation • 3 Dec 2020 • Veit Elser
Recent experiments by Springer and Kenyon have shown that a deep neural network can be trained to predict the action of $t$ steps of Conway's Game of Life automaton given millions of examples of this action on random initial states.
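For reference, the action the network is trained to predict, one step of the Life automaton, is a simple local rule; $t$ steps are $t$ compositions of it. A minimal sketch with a toroidal boundary (the boundary convention in the cited experiments may differ):

```python
import numpy as np

def life_step(grid):
    """One step of Conway's Game of Life on a toroidal grid of 0/1 cells."""
    # neighbor counts from the eight shifted copies of the grid
    nbrs = sum(np.roll(np.roll(grid, di, axis=0), dj, axis=1)
               for di in (-1, 0, 1) for dj in (-1, 0, 1)
               if (di, dj) != (0, 0))
    # birth on exactly 3 neighbors; survival on 2 or 3
    return ((nbrs == 3) | ((grid == 1) & (nbrs == 2))).astype(np.uint8)

# A vertical "blinker" flips to horizontal after one step
g = np.zeros((5, 5), dtype=np.uint8)
g[1:4, 2] = 1
g2 = life_step(g)
```

Applying `life_step` twice returns the blinker to its original orientation, which is a quick sanity check that the rule is implemented correctly.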
1 code implementation • 29 Oct 2019 • Veit Elser
The same approach is very successful in phase retrieval, where signals are reconstructed from magnitude constraints and general characteristics (sparsity, support, etc.).
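In phase retrieval the two constraint projections are typically: impose the measured Fourier magnitudes on the current iterate while keeping its phases, and zero the signal outside its known support. A minimal sketch of these two projections (a generic illustration, not the paper's full algorithm; the 1-D problem instance is invented):

```python
import numpy as np

def project_magnitude(x, mag):
    """Project onto the set of signals with the measured Fourier magnitudes."""
    X = np.fft.fft(x)
    # keep the current phases, impose the measured magnitudes
    return np.fft.ifft(mag * np.exp(1j * np.angle(X)))

def project_support(x, support):
    """Project onto real signals that vanish outside the known support."""
    y = np.real(x).copy()
    y[~support] = 0.0
    return y

# measured data: Fourier magnitudes of an unknown supported signal
rng = np.random.default_rng(0)
support = np.zeros(32, dtype=bool)
support[:8] = True
truth = np.where(support, rng.normal(size=32), 0.0)
mag = np.abs(np.fft.fft(truth))

# one round of alternating projections from a random start
x = rng.normal(size=32)
x = project_support(project_magnitude(x, mag), support)
```

Iterating these two projections (or, better, a difference-map/RRR combination of them) drives the iterate toward a signal consistent with both the magnitude data and the support constraint.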
no code implementations • 10 May 2018 • Veit Elser, Dan Schmidt, Jonathan Yedidia
This constraint is simply that the value of the output node associated with the correct class should be zero.
no code implementations • 26 Jan 2016 • Veit Elser
We study neural networks whose only non-linear components are multipliers, to test a new training rule in a context where the precise representation of data is paramount.
no code implementations • 8 May 2013 • Nate Derbinsky, José Bento, Veit Elser, Jonathan S. Yedidia
We describe how the powerful "Divide and Concur" (DC) algorithm for constraint satisfaction can be derived as a special case of a message-passing version of the Alternating Direction Method of Multipliers (ADMM) algorithm for convex optimization. We then introduce an improved message-passing algorithm that augments ADMM/DC with three distinct message weights: "certain" and "no opinion" weights in addition to the standard weight used in ADMM/DC.
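The basic DC scheme underlying this work is easy to state: keep one replica of the variables per constraint, project each replica onto its constraint ("divide"), then replace all replicas by a consensus average ("concur"). A stripped-down sketch on two affine constraints, without the ADMM messages or weights, on an invented problem instance:

```python
import numpy as np

def proj_sum(v):
    # project onto the constraint x + y = 4
    s = (v[0] + v[1] - 4.0) / 2.0
    return v - s * np.array([1.0, 1.0])

def proj_diff(v):
    # project onto the constraint x - y = 2
    s = (v[0] - v[1] - 2.0) / 2.0
    return v - s * np.array([1.0, -1.0])

# divide-and-concur: per-constraint replicas, then averaging
x = np.array([0.0, 0.0])
for _ in range(80):
    replicas = [proj_sum(x), proj_diff(x)]  # divide
    x = np.mean(replicas, axis=0)           # concur
# x approaches (3, 1), the unique point satisfying both constraints
```

On convex sets this plain averaging already converges; the message weights introduced in the paper address harder, nonconvex constraint networks where simple averaging can stall.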