no code implementations • 17 Nov 2022 • Shreyas Sundara Raman, Vanya Cohen, Ifrah Idrees, Eric Rosen, Ray Mooney, Stefanie Tellex, David Paulius
Our improvements transfer to a Boston Dynamics Spot robot initialized with a set of skills (specified in language) and associated preconditions, where CAPE improves the correctness metric of the executed task plans by 76.49% compared to SayCan.
no code implementations • 12 Jul 2022 • David Paulius, Alejandro Agostini, Dongheui Lee
We demonstrate our entire approach on long-horizon tasks in CoppeliaSim and show how learned action contexts can be extended to never-before-seen scenarios.
no code implementations • 5 Apr 2022 • Rafik Ayari, Matteo Pantano, David Paulius
Our initial results show that FOON can be used for an industrial use case and that we can use existing linked data models in LfD applications.
no code implementations • 4 Dec 2021 • Md. Sadman Sakib, David Paulius, Yu Sun
To address the problem of producing novel and flexible task plans called task trees, we explore how we can derive plans with concepts not originally in the robot's knowledge base.
no code implementations • 1 Jun 2021 • David Paulius, Alejandro Agostini, Yu Sun, Dongheui Lee
Following work on joint object-action representations, the functional object-oriented network (FOON) was introduced as a knowledge graph representation for robots.
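To make the FOON idea concrete, here is a minimal sketch of its core structure as described above: a bipartite graph whose object nodes and motion nodes are grouped into "functional units" (input objects, a motion, output objects). All class and field names below are illustrative choices for the sketch, not the authors' actual API.

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class ObjectNode:
    name: str   # e.g. "tomato"
    state: str  # e.g. "whole", "sliced"

@dataclass
class FunctionalUnit:
    """One FOON functional unit: inputs -> motion -> outputs."""
    motion: str
    inputs: List[ObjectNode]
    outputs: List[ObjectNode]

# Example unit: slicing a whole tomato with a knife yields tomato slices.
unit = FunctionalUnit(
    motion="slice",
    inputs=[ObjectNode("tomato", "whole"), ObjectNode("knife", "clean")],
    outputs=[ObjectNode("tomato", "sliced"), ObjectNode("knife", "dirty")],
)

# A FOON is then the union of many such units; knowledge retrieval searches
# for a chain of units whose outputs satisfy a goal object/state pair.
goal = ObjectNode("tomato", "sliced")
print(goal in unit.outputs)  # True
```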
no code implementations • 1 Jun 2021 • Md Sadman Sakib, Hailey Baez, David Paulius, Yu Sun
We first automatically convert task trees to recipes, and we then compare them with the human-created recipes in the Recipe1M+ dataset via a survey.
no code implementations • 10 Dec 2020 • Maxat Alibayev, David Paulius, Yu Sun
In this work, we propose a motion embedding strategy known as motion codes: vectorized representations of motions based on a manipulation's salient mechanical attributes.
no code implementations • 31 Jul 2020 • Maxat Alibayev, David Paulius, Yu Sun
A motion taxonomy can encode manipulations as a binary-encoded representation, which we refer to as motion codes.
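The encoding idea can be sketched as follows: each taxonomy attribute contributes a fixed-width bit field, and concatenating the fields yields one binary motion code per manipulation. The attribute names, values, and bit widths below are invented for illustration; they are not the paper's exact taxonomy.

```python
# Illustrative attribute tables (hypothetical, not the paper's taxonomy).
ATTRIBUTES = [
    ("contact_type", {"non-contact": "0", "contact": "1"}),
    ("engagement",   {"rigid": "00", "soft": "01", "neutral": "10"}),
    ("trajectory",   {"prismatic": "0", "revolute": "1"}),
]

def encode_motion(attrs: dict) -> str:
    """Concatenate per-attribute bit fields into one binary motion code."""
    return "".join(table[attrs[name]] for name, table in ATTRIBUTES)

code = encode_motion({"contact_type": "contact",
                      "engagement": "rigid",
                      "trajectory": "prismatic"})
print(code)  # "1000"
```

Because the codes are fixed-length bit strings, distances between them (e.g. Hamming distance) give a simple mechanical similarity measure between manipulations.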
no code implementations • 13 Jul 2020 • David Paulius, Nicholas Eales, Yu Sun
To represent motions from a mechanical point of view, this paper explores motion embedding using the motion taxonomy.
no code implementations • 1 Oct 2019 • David Paulius, Yongqiang Huang, Jason Meloncon, Yu Sun
This paper introduces a taxonomy of manipulations as seen especially in cooking for 1) grouping manipulations from the robotics point of view, 2) consolidating aliases and removing ambiguity for motion types, and 3) providing a path for transferring learned manipulations to new, unlearned manipulations.
1 code implementation • 1 May 2019 • David Paulius, Kelvin Sheng Pei Dong, Yu Sun
The paper also presents a task planning algorithm for the weighted FOON to allocate manipulation action load to the robot and human to achieve optimal performance while minimizing human effort.
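The allocation idea behind a weighted FOON can be sketched roughly as follows: each manipulation step carries a weight reflecting the robot's likelihood of success, and steps the robot is likely to complete go to the robot while the rest fall to the human, minimizing human effort. The thresholding rule, weights, and step names below are illustrative, not the paper's actual algorithm.

```python
# Hypothetical (action, robot-success-weight) pairs for one task plan.
steps = [("pick knife", 0.95), ("slice tomato", 0.40), ("stir pot", 0.85)]

def allocate(steps, robot_threshold=0.6):
    """Assign each step to the robot if its success weight clears the
    threshold, otherwise to the human (a stand-in for the paper's
    optimal-allocation objective)."""
    plan = []
    for action, robot_success in steps:
        agent = "robot" if robot_success >= robot_threshold else "human"
        plan.append((action, agent))
    return plan

print(allocate(steps))
# [('pick knife', 'robot'), ('slice tomato', 'human'), ('stir pot', 'robot')]
```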
1 code implementation • 5 Jul 2018 • David Paulius, Ahmad Babaeian Jelodar, Yu Sun
To further improve the performance of knowledge retrieval as a follow-up to our previous work, we discuss generalizing knowledge so that it can be applied to objects similar to those already in FOON, without manually annotating new sources of knowledge.
no code implementations • 5 Jul 2018 • David Paulius, Yu Sun
Within the realm of service robotics, researchers have placed a great amount of effort into learning, understanding, and representing motions as manipulations for task execution by robots.
no code implementations • 3 Jul 2018 • Ahmad Babaeian Jelodar, David Paulius, Yu Sun
Each action is therefore associated with a functional unit, and the sequence of actions is further evaluated to identify the single ongoing activity in the video.