
Sensor Fusion

21 papers with code · Miscellaneous

Sensor Fusion is the broad category of techniques that combine multiple on-board sensors to produce better measurement estimates. The sensors complement one another, compensating for each one's individual shortcomings.

Source: Real Time Dense Depth Estimation by Fusing Stereo with Sparse Depth Measurements
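
As a minimal illustration of the idea (not tied to any specific paper below), two independent noisy readings of the same quantity can be fused by inverse-variance weighting, the basic building block behind Kalman-style fusion. The sensor names and variances here are invented for the example:

```python
import numpy as np

def fuse(estimates, variances):
    """Inverse-variance weighted fusion of independent sensor estimates.

    Each reading is weighted by 1/variance, so more precise sensors
    dominate; the fused variance is always at most the smallest
    individual variance.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances
    fused_var = 1.0 / weights.sum()
    fused_est = fused_var * (weights * estimates).sum()
    return fused_est, fused_var

# Two hypothetical sensors measuring the same distance (metres):
# a noisy sonar (variance 4.0) and a more precise lidar (variance 1.0).
est, var = fuse([10.0, 12.0], [4.0, 1.0])
# est = 11.6 (pulled toward the lidar reading), var = 0.8
```

Note that the fused variance (0.8) is smaller than either sensor's alone, which is the point of fusing in the first place.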

Benchmarks

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Latest papers with code

FusionLane: Multi-Sensor Fusion for Lane Marking Semantic Segmentation Using Deep Neural Networks

9 Mar 2020 · rolandying/FusionLane

This paper proposes a lane marking semantic segmentation method based on a deep neural network that fuses LIDAR and camera data.

SEMANTIC SEGMENTATION SENSOR FUSION TIME SERIES

MonoLayout: Amodal scene layout from a single image

19 Feb 2020 · hbutsuak95/monolayout

We dub this problem amodal scene layout estimation, which involves "hallucinating" scene layout for even parts of the world that are occluded in the image.

AMODAL LAYOUT ESTIMATION SENSOR FUSION

Kalman Filter, Sensor Fusion, and Constrained Regression: Equivalences and Insights

NeurIPS 2019 · mariajahja/kf-sf-flu-nowcasting

In this work, we show that the state estimates from the KF in a standard linear dynamical system setting are equivalent to those given by the KF in a transformed system, with infinite process noise (i.e., a "flat prior") and an augmented measurement space.

MODEL SELECTION SENSOR FUSION
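
A minimal scalar sketch (not the paper's code) of the limit the abstract refers to: as the prior variance grows, i.e. under infinite process noise or a "flat prior", the Kalman gain tends to 1 and the posterior collapses onto the measurement alone. All numbers below are illustrative:

```python
def kf_update(prior_mean, prior_var, z, meas_var):
    """One scalar Kalman filter measurement update."""
    k = prior_var / (prior_var + meas_var)      # Kalman gain
    post_mean = prior_mean + k * (z - prior_mean)
    post_var = (1.0 - k) * prior_var
    return post_mean, post_var

# With a finite prior variance, the estimate blends prior and measurement:
kf_update(0.0, 1.0, 5.0, 1.0)    # -> (2.5, 0.5)

# As prior variance grows without bound (the "flat prior" limit),
# the gain approaches 1 and the posterior is the measurement itself:
kf_update(0.0, 1e12, 5.0, 1.0)   # approx (5.0, 1.0)
```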

PointPainting: Sequential Fusion for 3D Object Detection

CVPR 2020 · rshilliday/painting

Surprisingly, lidar-only methods outperform fusion methods on the main benchmark datasets, suggesting a gap in the literature.

3D OBJECT DETECTION SELF-DRIVING CARS SEMANTIC SEGMENTATION SENSOR FUSION

Improvements to Target-Based 3D LiDAR to Camera Calibration

7 Oct 2019 · UMich-BipedLab/extrinsic_lidar_camera_calibration

The homogeneous transformation between a LiDAR and monocular camera is required for sensor fusion tasks, such as SLAM.

POSE ESTIMATION QUANTIZATION SENSOR FUSION
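
A hedged sketch of how such an extrinsic calibration is used (the rotation and translation values are invented, not taken from the paper's repository): once the homogeneous LiDAR-to-camera transform is known, LiDAR points are mapped into the camera frame with a single matrix multiply:

```python
import numpy as np

# Hypothetical extrinsics: rotation R and translation t taking points
# from the LiDAR frame to the camera frame (values are made up).
R = np.eye(3)
t = np.array([0.1, 0.0, -0.2])

# Assemble the 4x4 homogeneous transform from R and t.
T = np.eye(4)
T[:3, :3] = R
T[:3, 3] = t

def lidar_to_camera(points_lidar, T):
    """Apply a homogeneous transform to an (N, 3) LiDAR point cloud."""
    n = points_lidar.shape[0]
    homo = np.hstack([points_lidar, np.ones((n, 1))])  # (N, 4)
    return (homo @ T.T)[:, :3]

pts = np.array([[1.0, 2.0, 3.0]])
lidar_to_camera(pts, T)   # -> [[1.1, 2.0, 2.8]]
```

With real calibration output, R and t would come from the estimated extrinsics rather than the identity rotation used here.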

LiDARTag: A Real-Time Fiducial Tag using Point Clouds

23 Aug 2019 · UMich-BipedLab/extrinsic_lidar_camera_calibration

Image-based fiducial markers are widely used in robotics and computer vision problems such as object tracking in cluttered or textureless environments, camera (and multi-sensor) calibration tasks, or vision-based simultaneous localization and mapping (SLAM).

MOTION CAPTURE OBJECT TRACKING SENSOR FUSION SIMULTANEOUS LOCALIZATION AND MAPPING

Uncertainty Estimation in One-Stage Object Detection

24 May 2019 · flkraus/bayesian-yolov3

Environment perception is the task for intelligent vehicles on which all subsequent processing steps rely.

OBJECT DETECTION SENSOR FUSION

RRPN: Radar Region Proposal Network for Object Detection in Autonomous Vehicles

1 May 2019 · mrnabati/RRPN

Region proposal algorithms play an important role in most state-of-the-art two-stage object detection networks by hypothesizing object locations in the image.

AUTONOMOUS DRIVING OBJECT DETECTION REGION PROPOSAL SENSOR FUSION
