Search Results for author: Kevin Ellis

Found 22 papers, 5 papers with code

WorldCoder, a Model-Based LLM Agent: Building World Models by Writing Code and Interacting with the Environment

no code implementations 19 Feb 2024 Hao Tang, Darren Key, Kevin Ellis

We give a model-based agent that builds a Python program representing its knowledge of the world based on its interactions with the environment.

Program Synthesis
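The loop described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the toy environment is invented, and `propose_revision` stands in for the paper's LLM-driven program rewriting.

```python
# Minimal sketch of a WorldCoder-style loop: the world model is literally
# a Python program, revised whenever its prediction disagrees with the
# environment. `true_env` and `propose_revision` are hypothetical stand-ins.

def true_env(state, action):
    # Toy environment the agent is trying to model: actions add to state.
    return state + action

def propose_revision(model_src, state, action, observed):
    # Stand-in for asking an LLM to rewrite the program given a
    # counterexample; here we simply patch in the correct rule.
    return "def model(state, action):\n    return state + action"

model_src = "def model(state, action):\n    return state"  # initial guess

for state, action in [(0, 1), (2, 3)]:
    ns = {}
    exec(model_src, ns)                      # run the current world model
    predicted = ns["model"](state, action)
    observed = true_env(state, action)
    if predicted != observed:                # surprise triggers revision
        model_src = propose_revision(model_src, state, action, observed)
```

After the first surprising transition, the revised program predicts the environment correctly.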

Doing Experiments and Revising Rules with Natural Language and Probabilistic Reasoning

no code implementations 8 Feb 2024 Wasu Top Piriyakulkij, Kevin Ellis

We build a computational model of how humans actively infer hidden rules by doing experiments.

Active Preference Inference using Language Models and Probabilistic Reasoning

no code implementations 19 Dec 2023 Top Piriyakulkij, Volodymyr Kuleshov, Kevin Ellis

To enable this ability for instruction-tuned large language models (LLMs), one may prompt them to ask users questions to infer their preferences, transforming the language models into more robust, interactive systems.

Decision Making

Rapid Motor Adaptation for Robotic Manipulator Arms

no code implementations 7 Dec 2023 Yichao Liang, Kevin Ellis, João Henriques

Drawing inspiration from RMA in locomotion and in-hand rotation, we use depth perception to develop agents tailored for rapid motor adaptation in a variety of manipulation tasks.

Friction

Top-Down Synthesis for Library Learning

1 code implementation 29 Nov 2022 Matthew Bowers, Theo X. Olausson, Lionel Wong, Gabriel Grand, Joshua B. Tenenbaum, Kevin Ellis, Armando Solar-Lezama

This paper introduces corpus-guided top-down synthesis as a mechanism for synthesizing library functions that capture common functionality from a corpus of programs in a domain specific language (DSL).
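The "capture common functionality" idea can be illustrated with a toy: find the most frequent compound subtree across a corpus of expression trees and promote it to a library function. The paper's corpus-guided top-down synthesis is far more sophisticated; this sketch and its names are invented for illustration only.

```python
from collections import Counter

def subtrees(tree):
    # Yield every subtree of a nested-tuple expression tree.
    yield tree
    if isinstance(tree, tuple):
        for child in tree[1:]:
            yield from subtrees(child)

# Toy corpus: three programs in a tiny DSL, two of which share x*x.
corpus = [
    ("add1", ("mul", "x", "x")),   # x*x + 1
    ("neg", ("mul", "x", "x")),    # -(x*x)
    ("add1", "x"),                 # x + 1
]

# The most frequent compound subtree is a candidate library function.
counts = Counter(s for t in corpus for s in subtrees(t) if isinstance(s, tuple))
library_fn = counts.most_common(1)[0][0]
```

Here `library_fn` is `("mul", "x", "x")`, the shared squaring routine worth abstracting.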

Toward Trustworthy Neural Program Synthesis

no code implementations 29 Sep 2022 Darren Key, Wen-Ding Li, Kevin Ellis

We develop an approach to estimate the probability that a program sampled from a large language model is correct.

Language Modelling · Large Language Model +1
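One simple proxy for such an estimate, not necessarily the paper's estimator, is agreement among independently sampled programs on a set of test inputs: if most samples compute the same outputs, the majority program is more likely correct.

```python
from collections import Counter

def agreement_score(programs, test_inputs):
    # Fraction of sampled programs that agree with the majority output,
    # averaged over inputs. High agreement is evidence of correctness.
    per_input = []
    for x in test_inputs:
        outputs = Counter(p(x) for p in programs)
        per_input.append(outputs.most_common(1)[0][1] / len(programs))
    return sum(per_input) / len(per_input)

# Three of four sampled "programs" compute 2*x; one computes x**2.
samples = [lambda x: x * 2, lambda x: x * 2, lambda x: x + x, lambda x: x ** 2]
score = agreement_score(samples, [3, 4])   # 0.75
```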

From Perception to Programs: Regularize, Overparameterize, and Amortize

no code implementations 13 Jun 2022 Hao Tang, Kevin Ellis

Toward combining inductive reasoning with perception abilities, we develop techniques for neurosymbolic program synthesis where perceptual input is first parsed by neural nets into a low-dimensional interpretable representation, which is then processed by a synthesized program.

Program Synthesis

Efficient Pragmatic Program Synthesis with Informative Specifications

no code implementations 5 Apr 2022 Saujas Vaduguru, Kevin Ellis, Yewen Pu

Surprisingly, we find that the synthesizer assuming a factored approximation performs better than a synthesizer assuming an exact joint distribution when evaluated on natural human inputs.

Program Synthesis

CrossBeam: Learning to Search in Bottom-Up Program Synthesis

1 code implementation ICLR 2022 Kensen Shi, Hanjun Dai, Kevin Ellis, Charles Sutton

Many approaches to program synthesis perform a search within an enormous space of programs to find one that satisfies a given specification.

Program Synthesis · Structured Prediction
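The bottom-up search that CrossBeam learns to guide can be sketched without the guidance model: repeatedly combine smaller programs into larger ones, checking each against the specification. This toy version (invented for illustration) omits both the learned model and the observational-equivalence deduplication real systems use.

```python
import itertools

def bottom_up_synthesize(inputs, outputs, rounds=2):
    # Plain bottom-up enumeration over arithmetic expressions in one
    # variable. Real systems deduplicate programs by their observed
    # outputs; CrossBeam additionally learns which combinations to try.
    pool = [lambda x: x, lambda x: 1]          # terminal programs
    for _ in range(rounds):
        combos = []
        for f, g in itertools.product(pool, repeat=2):
            combos.append(lambda x, f=f, g=g: f(x) + g(x))
            combos.append(lambda x, f=f, g=g: f(x) * g(x))
        pool.extend(combos)
        for prog in pool:
            if all(prog(i) == o for i, o in zip(inputs, outputs)):
                return prog
    return None

prog = bottom_up_synthesize([1, 2, 3], [3, 5, 7])   # target: x -> 2*x + 1
```

The target is found in the second round, once round-one programs like `x + x` can be combined with `1`.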

Scaling Neural Program Synthesis with Distribution-based Search

1 code implementation 24 Oct 2021 Nathanaël Fijalkow, Guillaume Lagarde, Théo Matricon, Kevin Ellis, Pierre Ohlmann, Akarsh Potta

We investigate how to augment probabilistic and neural program synthesis methods with new search algorithms, proposing a framework called distribution-based search.

Program Synthesis
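One instance of searching from a distribution over programs is to sample candidates from a probabilistic grammar until one satisfies the examples. This toy grammar is invented for illustration; the paper's framework also covers strategies such as enumeration in decreasing probability order.

```python
import random

# Toy probabilistic grammar over arithmetic expressions in one variable.
RULES = [
    (0.40, "x"),
    (0.20, "1"),
    (0.25, "(A + A)"),
    (0.15, "(A * A)"),
]

def sample_expr(rng, depth=0):
    # Sample an expression top-down; fall back to terminals when deep.
    if depth > 3:
        return rng.choice(["x", "1"])
    r, acc = rng.random(), 0.0
    for prob, rhs in RULES:
        acc += prob
        if r <= acc:
            break
    while "A" in rhs:
        rhs = rhs.replace("A", sample_expr(rng, depth + 1), 1)
    return rhs

def search(examples, budget=10_000, seed=0):
    # Draw candidates i.i.d. from the grammar's distribution and keep the
    # first one consistent with all input-output examples.
    rng = random.Random(seed)
    for _ in range(budget):
        expr = sample_expr(rng)
        if all(eval(expr, {"x": x}) == y for x, y in examples):
            return expr
    return None

expr = search([(1, 2), (3, 6)])   # target: x -> 2*x
```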

Hybrid Memoised Wake-Sleep: Approximate Inference at the Discrete-Continuous Interface

no code implementations ICLR 2022 Tuan Anh Le, Katherine M. Collins, Luke Hewitt, Kevin Ellis, N. Siddharth, Samuel J. Gershman, Joshua B. Tenenbaum

We build on a recent approach, Memoised Wake-Sleep (MWS), which alleviates part of the problem by memoising discrete variables, and extend it to allow for a principled and effective way to handle continuous variables by learning a separate recognition model used for importance-sampling based approximate inference and marginalization.

Scene Understanding · Time Series +1

Leveraging Language to Learn Program Abstractions and Search Heuristics

no code implementations 18 Jun 2021 Catherine Wong, Kevin Ellis, Joshua B. Tenenbaum, Jacob Andreas

Inductive program synthesis, or inferring programs from examples of desired behavior, offers a general paradigm for building interpretable, robust, and generalizable machine learning systems.

Program Synthesis

Program Synthesis with Pragmatic Communication

no code implementations NeurIPS 2020 Yewen Pu, Kevin Ellis, Marta Kryven, Josh Tenenbaum, Armando Solar-Lezama

Given a specification, we score a candidate program both on its consistency with the specification, and also whether a rational speaker would choose this particular specification to communicate that program.

Inductive Bias · Program Synthesis

DreamCoder: Growing generalizable, interpretable knowledge with wake-sleep Bayesian program learning

3 code implementations 15 Jun 2020 Kevin Ellis, Catherine Wong, Maxwell Nye, Mathias Sable-Meyer, Luc Cary, Lucas Morales, Luke Hewitt, Armando Solar-Lezama, Joshua B. Tenenbaum

It builds expertise by creating programming languages for expressing domain concepts, together with neural networks to guide the search for programs within these languages.

Drawing Pictures · Program induction +1

Write, Execute, Assess: Program Synthesis with a REPL

no code implementations NeurIPS 2019 Kevin Ellis, Maxwell Nye, Yewen Pu, Felix Sosa, Josh Tenenbaum, Armando Solar-Lezama

We present a neural program synthesis approach integrating components which write, execute, and assess code to navigate the search space of possible programs.

Navigate · Program Synthesis
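The write-execute-assess cycle can be sketched as a loop: candidate code is written, executed in a scratch namespace standing in for a REPL, and assessed against the examples. The fixed candidate list here is a hypothetical stand-in for the paper's learned writer network.

```python
# Minimal write-execute-assess loop. Candidates stand in for programs
# proposed by a learned policy; assessment is purely execution-based.

CANDIDATES = [
    "def f(xs): return sorted(xs)",
    "def f(xs): return sorted(xs, reverse=True)",
]

def assess(fn, examples):
    # How many input-output examples does the executed code satisfy?
    return sum(fn(x) == y for x, y in examples)

examples = [([3, 1, 2], [3, 2, 1]), ([5], [5])]

best_fn, best_score = None, -1
for src in CANDIDATES:                         # write
    namespace = {}
    exec(src, namespace)                       # execute
    score = assess(namespace["f"], examples)   # assess
    if score > best_score:
        best_fn, best_score = namespace["f"], score
```

The descending sort satisfies both examples and wins the assessment.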

Learning to Infer and Execute 3D Shape Programs

no code implementations ICLR 2019 Yonglong Tian, Andrew Luo, Xingyuan Sun, Kevin Ellis, William T. Freeman, Joshua B. Tenenbaum, Jiajun Wu

Human perception of 3D shapes goes beyond reconstructing them as a set of points or a composition of geometric primitives: we also effortlessly understand higher-level shape structure such as the repetition and reflective symmetry of object parts.

Learning Libraries of Subroutines for Neurally–Guided Bayesian Program Induction

no code implementations NeurIPS 2018 Kevin Ellis, Lucas Morales, Mathias Sablé-Meyer, Armando Solar-Lezama, Josh Tenenbaum

Successful approaches to program induction require a hand-engineered domain-specific language (DSL), constraining the space of allowed programs and imparting prior knowledge of the domain.

Program induction · regression +1

Library Learning for Neurally-Guided Bayesian Program Induction

no code implementations 1 Dec 2018 Kevin Ellis, Lucas Morales, Mathias Sablé-Meyer, Armando Solar-Lezama, Joshua B. Tenenbaum

Successful approaches to program induction require a hand-engineered domain-specific language (DSL), constraining the space of allowed programs and imparting prior knowledge of the domain.

Program induction · regression +1

Sampling for Bayesian Program Learning

no code implementations NeurIPS 2016 Kevin Ellis, Armando Solar-Lezama, Josh Tenenbaum

Towards learning programs from data, we introduce the problem of sampling programs from posterior distributions conditioned on that data.

Program Synthesis
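The posterior in question weights each program by its prior probability times the likelihood it assigns to the observed data. A tiny worked example (the three-program space and noise model below are invented; the paper addresses sampling when the program space is combinatorially large and exact enumeration is impossible):

```python
import random

# posterior(p) ∝ prior(p) * likelihood(data | p), with a noise model so
# that imperfect programs retain some probability mass.

PROGRAMS = {
    "double":   lambda x: 2 * x,
    "square":   lambda x: x * x,
    "identity": lambda x: x,
}
PRIOR = {"double": 0.4, "square": 0.3, "identity": 0.3}

def posterior(data, noise=0.1):
    # Each example is reproduced exactly with prob 1-noise, else noise.
    weights = {}
    for name, f in PROGRAMS.items():
        lik = 1.0
        for x, y in data:
            lik *= (1 - noise) if f(x) == y else noise
        weights[name] = PRIOR[name] * lik
    z = sum(weights.values())
    return {name: w / z for name, w in weights.items()}

def sample_program(data, rng):
    # Draw a program name according to its posterior probability.
    post = posterior(data)
    names = list(post)
    return rng.choices(names, weights=[post[n] for n in names])[0]

post = posterior([(2, 4), (3, 6)])   # "double" explains both examples
```

Conditioning on `(2, 4)` and `(3, 6)` concentrates nearly all posterior mass on `double`, even though `square` happens to match the first example.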

Unsupervised Learning by Program Synthesis

no code implementations NeurIPS 2015 Kevin Ellis, Armando Solar-Lezama, Josh Tenenbaum

We introduce an unsupervised learning algorithm that combines probabilistic modeling with solver-based techniques for program synthesis. We apply our techniques to both a visual learning domain and a language learning problem, showing that our algorithm can learn many visual concepts from only a few examples and that it can recover some English inflectional morphology. Taken together, these results give both a new approach to unsupervised learning of symbolic compositional structures, and a technique for applying program synthesis tools to noisy data.

Program Synthesis
