Search Results for author: Cedric Pradalier

Found 5 papers, 1 paper with code

Gaussian Latent Representations for Uncertainty Estimation using Mahalanobis Distance in Deep Classifiers

1 code implementation · 23 May 2023 · Aishwarya Venkataramanan, Assia Benbihi, Martin Laviale, Cedric Pradalier

Recent works show that the data distribution in a network's latent space is useful for estimating classification uncertainty and detecting Out-of-distribution (OOD) samples.

Decision Making Under Uncertainty · Representation Learning
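The latent-space idea above is commonly realized by fitting class-conditional Gaussians to a network's latent features and scoring samples by their Mahalanobis distance to the nearest class mean. The following is a minimal NumPy sketch of that generic recipe, not the paper's actual implementation; function names and the tied-covariance choice are illustrative assumptions.

```python
import numpy as np

def fit_class_gaussians(features, labels):
    """Fit per-class means and a shared (tied) covariance in latent space.

    Illustrative sketch of the classic Mahalanobis OOD setup; not the
    paper's code. Returns the class means and the inverse covariance.
    """
    classes = np.unique(labels)
    means = {c: features[labels == c].mean(axis=0) for c in classes}
    # Tied covariance pooled over classes, regularized for invertibility.
    centered = np.vstack([features[labels == c] - means[c] for c in classes])
    cov = np.cov(centered, rowvar=False) + 1e-6 * np.eye(features.shape[1])
    return means, np.linalg.inv(cov)

def mahalanobis_score(x, means, cov_inv):
    """OOD score: minimum Mahalanobis distance to any class mean.

    Higher scores indicate the sample lies far from every class Gaussian,
    i.e. it is more likely out-of-distribution.
    """
    dists = [np.sqrt((x - m) @ cov_inv @ (x - m)) for m in means.values()]
    return min(dists)
```

An in-distribution latent vector near a class mean receives a low score, while a vector far from all class means receives a high one, which is what makes the distance usable as both an uncertainty estimate and an OOD detector.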

How To Train Your HERON

no code implementations · 20 Feb 2021 · Antoine Richard, Stephanie Aravecchia, Thomas Schillaci, Matthieu Geist, Cedric Pradalier

In this paper we apply Deep Reinforcement Learning (Deep RL) and Domain Randomization to solve a navigation task in a natural environment relying solely on a 2D laser scanner.

reinforcement-learning · Reinforcement Learning (RL)
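Domain Randomization, as used above, means resampling simulator parameters at every episode reset so the learned policy does not overfit to one simulated world. A minimal sketch under assumed parameter names and ranges (the actual HERON simulator parameters are not given here):

```python
import random

def randomized_env_params(rng=random):
    """Domain-randomization sketch: draw fresh simulator parameters per
    episode reset. Parameter names and ranges are hypothetical, chosen
    only to illustrate the technique for a laser-equipped surface vessel."""
    return {
        "laser_noise_std": rng.uniform(0.0, 0.05),  # 2D laser range noise (m)
        "water_current": rng.uniform(-0.5, 0.5),    # lateral drift (m/s)
        "thruster_gain": rng.uniform(0.8, 1.2),     # actuation scaling factor
    }
```

A training loop would call this at each `env.reset()`, so the Deep RL agent sees a distribution of dynamics and sensor noise rather than a single fixed simulation.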

A Study on Trees's Knots Prediction from their Bark Outer-Shape

no code implementations · 5 Oct 2020 · Mejri Mohamed, Antoine Richard, Cedric Pradalier

In the industry, the value of wood-logs strongly depends on their internal structure and more specifically on the knots' distribution inside the trees.

Robust Monocular Edge Visual Odometry through Coarse-to-Fine Data Association

no code implementations · 25 Sep 2019 · Xiaolong Wu, Patricio Vela, Cedric Pradalier

In this work, we propose a monocular visual odometry framework that exploits the best attributes of edge features for illumination-robust camera tracking, while at the same time mitigating the performance degradation of edge mapping.

Monocular Visual Odometry · Motion Estimation +1

Semantic Nearest Neighbor Fields Monocular Edge Visual-Odometry

no code implementations · 1 Apr 2019 · Xiaolong Wu, Assia Benbihi, Antoine Richard, Cedric Pradalier

The core of our approach is a semantic nearest neighbor field that facilitates a robust data association of edges across frames using semantics.

Edge Detection · Monocular Visual Odometry +1
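The semantic nearest-neighbor idea above can be illustrated with a small sketch: for each edge point in the current frame, only reference-frame edge points carrying the same semantic label are considered as match candidates, and the nearest one wins. This is a simplified brute-force illustration of the association step, not the paper's field-based implementation; the function name is hypothetical.

```python
import numpy as np

def semantic_nearest_neighbor(query_pts, query_labels, ref_pts, ref_labels):
    """Associate each query edge point with the nearest reference edge point
    that shares its semantic label.

    Restricting candidates by semantics prunes geometrically close but
    semantically inconsistent matches, which is what makes the data
    association robust across frames. Returns one reference index per
    query point, or -1 when no reference point carries that label.
    """
    matches = []
    for p, lab in zip(query_pts, query_labels):
        candidates = np.where(ref_labels == lab)[0]
        if candidates.size == 0:
            matches.append(-1)  # no semantically consistent candidate
            continue
        d = np.linalg.norm(ref_pts[candidates] - p, axis=1)
        matches.append(int(candidates[np.argmin(d)]))
    return matches
```

In a real pipeline the brute-force search would be replaced by a precomputed per-label nearest-neighbor field (or a k-d tree per semantic class) so lookups are constant-time during tracking.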
