Search Results for author: Pedro F. Proença

Found 6 papers, 2 papers with code

TRADE: Object Tracking with 3D Trajectory and Ground Depth Estimates for UAVs

no code implementations • 7 Oct 2022 • Pedro F. Proença, Patrick Spieler, Robert A. Hewitt, Jeff Delaune

We propose TRADE for robust tracking and 3D localization of a moving target in cluttered environments, from UAVs equipped with a single camera.

Depth Estimation · Object Tracking
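The snippet does not spell out the estimator, but the "Ground Depth Estimates" in the title point to the usual single-camera recipe for putting a ground target in 3D: back-project the detection through the camera model and intersect the ray with an estimated ground plane. A minimal sketch of that geometry follows; the intrinsics, pose, and plane values are illustrative placeholders, not taken from the paper:

```python
import numpy as np

def localize_on_ground(pixel, K, R_wc, t_wc, plane_n, plane_d):
    """Intersect a back-projected pixel ray with a ground plane.

    pixel      : (u, v) target detection in the image
    K          : 3x3 camera intrinsics
    R_wc, t_wc : camera-to-world rotation and camera centre
    plane_n, plane_d : ground plane n.x + d = 0 in the world frame
    Returns the 3D point in world coordinates, or None if the ray is
    parallel to the plane or the intersection lies behind the camera.
    """
    ray_cam = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    ray_world = R_wc @ ray_cam
    denom = plane_n @ ray_world
    if abs(denom) < 1e-9:
        return None
    s = -(plane_n @ t_wc + plane_d) / denom
    return t_wc + s * ray_world if s > 0 else None

# Illustrative values only: a downward-looking camera 10 m above flat ground.
K = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])
R_wc = np.diag([1.0, -1.0, -1.0])          # camera z-axis points down
t_wc = np.array([0.0, 0.0, 10.0])
print(localize_on_ground((350, 300), K, R_wc, t_wc,
                         plane_n=np.array([0.0, 0.0, 1.0]), plane_d=0.0))
```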

Optimizing Terrain Mapping and Landing Site Detection for Autonomous UAVs

no code implementations • 7 May 2022 • Pedro F. Proença, Jeff Delaune, Roland Brockers

The next generation of Mars rotorcraft requires on-board autonomous hazard-avoidance landing.

TACO: Trash Annotations in Context for Litter Detection

1 code implementation • 16 Mar 2020 • Pedro F. Proença, Pedro Simões

TACO is an open image dataset for litter detection and segmentation, which is growing through crowdsourcing.

Instance Segmentation · Segmentation · +1
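Since TACO has a public code release, a quick first look at the dataset is to load its annotation file, which is assumed here to follow the COCO annotation layout (the path below is an illustrative guess; check the repository for the actual file location):

```python
import json
from collections import Counter

# Path assumed for illustration; see the TACO repository for the actual file.
with open("data/annotations.json") as f:
    taco = json.load(f)

print(len(taco["images"]), "images,", len(taco["annotations"]), "annotations")

# Count annotations per category name (COCO-style keys assumed).
id_to_name = {c["id"]: c["name"] for c in taco["categories"]}
counts = Counter(id_to_name[a["category_id"]] for a in taco["annotations"])
for name, n in counts.most_common(10):
    print(f"{name}: {n}")
```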

Fast Cylinder and Plane Extraction from Depth Cameras for Visual Odometry

2 code implementations • 6 Mar 2018 • Pedro F. Proença, Yang Gao

This paper presents CAPE, a method to extract plane and cylinder segments from organized point clouds; by operating on a grid of planar cells, it processes 640x480 depth images on a single CPU core at an average of 300 Hz.

Visual Odometry
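The snippet's key idea, working on a grid of cells rather than on individual pixels, can be illustrated with a simplified sketch: split the organized point cloud into fixed-size cells, fit a plane to each cell by eigen-decomposition, and keep the cells whose fit residual is small. This is not the paper's CAPE implementation (which also merges cells into segments and fits cylinders); the cell size and threshold below are placeholders:

```python
import numpy as np

def planar_cells(points, cell=20, max_rms=0.01):
    """Label grid cells of an organized point cloud that are nearly planar.

    points  : (H, W, 3) organized point cloud in metres, NaN where depth is missing
    cell    : cell size in pixels (placeholder value)
    max_rms : max RMS point-to-plane distance for a cell to count as planar
    Returns a list of (cell_row, cell_col, normal, centroid).
    """
    H, W, _ = points.shape
    cells = []
    for r in range(0, H - cell + 1, cell):
        for c in range(0, W - cell + 1, cell):
            block = points[r:r + cell, c:c + cell].reshape(-1, 3)
            block = block[~np.isnan(block).any(axis=1)]
            if len(block) < cell * cell // 2:       # too many missing depths
                continue
            centroid = block.mean(axis=0)
            d = block - centroid
            cov = d.T @ d / len(d)
            eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalues
            normal = eigvecs[:, 0]                  # direction of least variance
            rms = np.sqrt(max(eigvals[0], 0.0))     # RMS distance to the fitted plane
            if rms < max_rms:
                cells.append((r // cell, c // cell, normal, centroid))
    return cells
```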

SPLODE: Semi-Probabilistic Point and Line Odometry with Depth Estimation from RGB-D Camera Motion

no code implementations • 9 Aug 2017 • Pedro F. Proença, Yang Gao

This paper presents a visual odometry method based on point and line features that leverages both measurements from a depth sensor and depth estimates from camera motion.

Depth Estimation · Motion Estimation · +1
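The snippet says the method uses both sensor depth and motion-derived depth; the exact probabilistic model is not given here, but a standard way to combine two noisy estimates of the same feature depth is inverse-variance weighting, sketched below under a simple Gaussian assumption with illustrative variances:

```python
def fuse_depths(z_sensor, var_sensor, z_motion, var_motion):
    """Fuse two independent Gaussian depth estimates of the same feature.

    Returns the inverse-variance weighted mean and its variance.
    If one source is unavailable (None), the other is returned unchanged.
    """
    if z_sensor is None:
        return z_motion, var_motion
    if z_motion is None:
        return z_sensor, var_sensor
    w_s, w_m = 1.0 / var_sensor, 1.0 / var_motion
    var = 1.0 / (w_s + w_m)
    z = var * (w_s * z_sensor + w_m * z_motion)
    return z, var

# e.g. a 4.0 m sensor reading (sigma 5 cm) and a 4.3 m triangulated estimate (sigma 30 cm)
print(fuse_depths(4.0, 0.05**2, 4.3, 0.30**2))
```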

Probabilistic Combination of Noisy Points and Planes for RGB-D Odometry

no code implementations • 18 May 2017 • Pedro F. Proença, Yang Gao

This work proposes a visual odometry method that combines point and plane primitives extracted from a noisy depth camera.

Motion Estimation · Visual Odometry
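The paper's probabilistic weighting is not spelled out in the snippet, but the basic idea of mixing point and plane primitives in a single pose-alignment objective can be sketched as a weighted least-squares cost over a rotation R and translation t; the scalar weights below are stand-ins for proper measurement covariances:

```python
import numpy as np

def combined_cost(R, t, point_pairs, plane_pairs, w_point=1.0, w_plane=1.0):
    """Weighted least-squares cost mixing point and plane matches for a pose (R, t).

    point_pairs : list of (p_src, p_dst) 3D point correspondences
    plane_pairs : list of ((n_src, d_src), (n_dst, d_dst)) plane correspondences,
                  each plane written as n.x + d = 0 with a unit normal n
    """
    cost = 0.0
    for p_src, p_dst in point_pairs:
        r = R @ p_src + t - p_dst            # point-to-point residual
        cost += w_point * (r @ r)
    for (n_src, d_src), (n_dst, d_dst) in plane_pairs:
        n_pred = R @ n_src                   # plane normal under the pose
        d_pred = d_src - n_pred @ t          # plane offset under the pose
        r_n = n_pred - n_dst
        cost += w_plane * (r_n @ r_n + (d_pred - d_dst) ** 2)
    return cost
```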
