Search Results for author: Michal Nazarczuk

Found 5 papers, 0 papers with code

Closed Loop Interactive Embodied Reasoning for Robot Manipulation

no code implementations · 23 Apr 2024 · Michal Nazarczuk, Jan Kristof Behrens, Karla Stepanova, Matej Hoffmann, Krystian Mikolajczyk

Embodied reasoning systems integrate robotic hardware and cognitive processes to perform complex tasks, typically in response to a natural language query about a specific physical environment.

Robot Manipulation

SWAGS: Sampling Windows Adaptively for Dynamic 3D Gaussian Splatting

no code implementations · 20 Dec 2023 · Richard Shaw, Jifei Song, Arthur Moreau, Michal Nazarczuk, Sibi Catley-Chandar, Helisa Dhamo, Eduardo Perez-Pellitero

We model the dynamics of a scene using a tunable MLP, which learns the deformation field from a canonical space to a set of 3D Gaussians per frame.
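The deformation-field idea in the abstract above can be sketched as follows: a small MLP takes a canonical 3D Gaussian center and a time value, and predicts a per-frame position offset. This is a minimal illustrative sketch, not the authors' implementation; the network sizes, the NumPy MLP, and the `deform` function are all assumptions for illustration.

```python
# Hypothetical sketch of a deformation-field MLP: canonical 3D Gaussian
# centers plus a time input are mapped to per-frame position offsets.
# Weights are random here; in practice they would be learned.
import numpy as np

rng = np.random.default_rng(0)

# Canonical space: N Gaussians, each represented here by a 3D center only.
N = 4
canonical_centers = rng.normal(size=(N, 3))

# Tiny two-layer MLP with assumed sizes; input = (x, y, z, t).
W1 = rng.normal(scale=0.1, size=(4, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 3))
b2 = np.zeros(3)

def deform(centers: np.ndarray, t: float) -> np.ndarray:
    """Return per-frame centers: canonical centers plus MLP-predicted offsets."""
    # Append the frame time t to every center to form the MLP input.
    x = np.concatenate([centers, np.full((len(centers), 1), t)], axis=1)
    h = np.tanh(x @ W1 + b1)   # hidden activation
    offsets = h @ W2 + b2      # predicted 3D displacement per Gaussian
    return centers + offsets

frame_centers = deform(canonical_centers, t=0.5)
print(frame_centers.shape)  # (4, 3)
```

One set of canonical Gaussians is thus shared across all frames, with only the small MLP accounting for scene motion.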

Novel View Synthesis

SAMPLE-HD: Simultaneous Action and Motion Planning Learning Environment

no code implementations · 1 Jun 2022 · Michal Nazarczuk, Tony Ng, Krystian Mikolajczyk

Humans exhibit incredibly high levels of multi-modal understanding: combining visual cues with read or heard knowledge comes easily to us and allows for highly accurate interaction with the surrounding environment.

Motion Planning · Question Answering +2

Self-supervised HDR Imaging from Motion and Exposure Cues

no code implementations · 23 Mar 2022 · Michal Nazarczuk, Sibi Catley-Chandar, Ales Leonardis, Eduardo Pérez Pellitero

Recent High Dynamic Range (HDR) techniques extend the capabilities of current cameras, where scenes with a wide range of illumination cannot be accurately captured in a single low-dynamic-range (LDR) image.

SHOP-VRB: A Visual Reasoning Benchmark for Object Perception

no code implementations · 6 Apr 2020 · Michal Nazarczuk, Krystian Mikolajczyk

In this paper, we present an approach and a benchmark for visual reasoning in robotics applications, in particular small-object grasping and manipulation.

Object · Visual Reasoning
