Sensor Fusion

92 papers with code • 0 benchmarks • 2 datasets

Sensor fusion is the process of combining sensor data, or data derived from disparate sources, so that the resulting information has less uncertainty than would be possible if these sources were used individually. [Wikipedia]
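The uncertainty reduction in this definition can be made concrete with the textbook case of fusing two independent Gaussian estimates of the same quantity by inverse-variance weighting; the fused variance is always smaller than either input variance. This is a minimal illustrative sketch (the sensor values are hypothetical), not code from any of the repositories listed below:

```python
def fuse(mean_a, var_a, mean_b, var_b):
    """Fuse two independent Gaussian estimates of the same quantity.

    Inverse-variance weighting: the more certain sensor gets the
    larger weight, and the fused variance var_a*var_b/(var_a+var_b)
    is strictly smaller than either input variance.
    """
    w_a = var_b / (var_a + var_b)  # weight on sensor A
    w_b = var_a / (var_a + var_b)  # weight on sensor B
    fused_mean = w_a * mean_a + w_b * mean_b
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused_mean, fused_var

# Two noisy range sensors observing the same distance (made-up values):
mean, var = fuse(10.2, 0.5, 9.8, 0.3)
# var is below both 0.5 and 0.3 -- the fused estimate is more certain.
```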

Most implemented papers

Integrating Generic Sensor Fusion Algorithms with Sound State Representations through Encapsulation of Manifolds

parzival2/pixhawk_ekf_python 6 Jul 2011

Common estimation algorithms, such as least squares estimation or the Kalman filter, operate on a state in a state space S that is represented as a real-valued vector.
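The paper's point is that not every state lives naturally in a real-valued vector space: orientations, for example, wrap around, and the manifold-encapsulation idea replaces plain +/- with ⊞ ("boxplus") and ⊟ ("boxminus") operators so generic estimators still work. A minimal sketch for the simplest such manifold, a heading angle on the circle (function names and values are illustrative, not from the paper's implementation):

```python
import math

def boxplus(theta, delta):
    """Apply a tangent-space increment to an angle state, wrapping
    the result back into (-pi, pi]."""
    s = theta + delta
    return math.atan2(math.sin(s), math.cos(s))

def boxminus(theta_b, theta_a):
    """Shortest signed difference theta_b - theta_a on the circle."""
    d = theta_b - theta_a
    return math.atan2(math.sin(d), math.cos(d))

# Near the wrap-around point, naive vector subtraction misleads an
# estimator, while boxminus recovers the small true angular error:
a = math.pi - 0.05
b = -math.pi + 0.05
plain = b - a           # approx -6.18 rad, a huge spurious residual
wrapped = boxminus(b, a)  # approx +0.10 rad, the actual difference
```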

Distributed Mapping with Privacy and Communication Constraints: Lightweight Algorithms and Object-based Models

CogRob/distributed-mapper 11 Feb 2017

Our field tests show that the combined use of our distributed algorithms and object-based models reduces the communication requirements by several orders of magnitude and enables distributed mapping with large teams of robots in real-world problems.

Distributed Deep Neural Networks over the Cloud, the Edge and End Devices

kunglab/ddnn 6 Sep 2017

In our experiment, compared with the traditional method of offloading raw sensor data to be processed in the cloud, DDNN locally processes most sensor data on end devices while achieving high accuracy, and is able to reduce the communication cost by a factor of over 20.

DeLS-3D: Deep Localization and Segmentation with a 3D Semantic Map

pengwangucla/DeLS-3D CVPR 2018

The uniqueness of our design is a sensor fusion scheme which integrates camera videos, motion sensors (GPS/IMU), and a 3D semantic map in order to achieve robustness and efficiency of the system.

Modular Sensor Fusion for Semantic Segmentation

ethz-asl/modular_semantic_segmentation 30 Jul 2018

Sensor fusion is a fundamental process in robotic systems as it extends the perceptual range and increases robustness in real-world operations.

Online Temporal Calibration for Monocular Visual-Inertial Systems

HKUST-Aerial-Robotics/VINS-Mono 2 Aug 2018

Visual-inertial fusion has become a popular approach to 6-DOF state estimation in recent years.

Multimodal Sensor Fusion in Single Thermal Image Super-Resolution

fsalmasri/MSF-STI-SR 21 Dec 2018

(III) A benchmark ULB17-VT dataset that contains thermal images and their visible-image counterparts is presented.

Seeing Through Fog Without Seeing Fog: Deep Multimodal Sensor Fusion in Unseen Adverse Weather

princeton-computational-imaging/SeeingThroughFog CVPR 2020

The fusion of multimodal sensor streams, such as camera, lidar, and radar measurements, plays a critical role in object detection for autonomous vehicles, which base their decision making on these inputs.

Tightly Coupled 3D Lidar Inertial Odometry and Mapping

hyye/lio-mapping 15 Apr 2019

Through sensor fusion, we can compensate for the deficiencies of stand-alone sensors and provide more reliable estimates.
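A classic small-scale example of compensating complementary sensor weaknesses is the complementary filter: a gyroscope integrates accurately over short horizons but drifts, while an accelerometer-derived angle is drift-free but noisy. This sketch (with made-up rates and gains, unrelated to the lidar-inertial pipeline above) shows how blending the two bounds the gyro's drift:

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One step of a complementary filter.

    High-pass path: integrate the gyro rate (accurate short-term, drifts).
    Low-pass path: pull toward the accelerometer angle (noisy, drift-free).
    """
    gyro_pitch = pitch + gyro_rate * dt
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Hypothetical stream: a gyro with a constant 0.01 rad/s bias while the
# vehicle is actually level (true pitch = 0, accelerometer reads 0).
pitch = 0.0
for _ in range(1000):
    pitch = complementary_filter(pitch, gyro_rate=0.01,
                                 accel_pitch=0.0, dt=0.01)
# Pure gyro integration would have drifted to 0.1 rad by now; the
# fused estimate instead settles near alpha*bias*dt/(1-alpha) ~ 0.005 rad.
```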

Uncertainty Estimation in One-Stage Object Detection

flkraus/bayesian-yolov3 24 May 2019

Environment perception is the task on which all subsequent steps of an intelligent vehicle rely.