Sensor fusion is the broad category of techniques that combine multiple on-board sensors to produce better measurement estimates than any single sensor provides alone. The sensors are combined so that they complement each other and overcome individual shortcomings.
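As a minimal illustration of this idea (a generic sketch, not the method of any particular system mentioned here), two independent noisy measurements of the same quantity can be fused by inverse-variance weighting; the fused estimate always has lower variance than either input:

```python
def fuse(z1, var1, z2, var2):
    """Inverse-variance weighted fusion of two independent scalar
    measurements z1, z2 with variances var1, var2.
    Returns the fused estimate and its (smaller) variance."""
    var = 1.0 / (1.0 / var1 + 1.0 / var2)   # fused variance
    z = var * (z1 / var1 + z2 / var2)       # precision-weighted mean
    return z, var

# Example: a lidar range (low noise) fused with a radar range (higher noise)
z, var = fuse(10.2, 0.5**2, 9.8, 1.0**2)   # -> estimate 10.12 m, variance 0.2
```

The more precise sensor dominates the weighted mean, while the second sensor still tightens the overall uncertainty.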
Besides the common video and audio sensors, the system also includes a thermal infrared camera, which is shown to be a feasible solution for the drone detection task.
Our approach uses a multi-sensor platform to build a three-dimensional semantic voxelized map that accounts for the uncertainty of all the processes involved.
With the rapid development of intelligent vehicles and Advanced Driver Assistance Systems (ADAS), the transportation system now involves mixed levels of human driver engagement.
Since the DARPA Grand Challenges (rural) of 2004/05 and the Urban Challenge of 2007, autonomous driving has been one of the most active fields of AI application.
The fusion of multimodal sensor streams, such as camera, lidar, and radar measurements, plays a critical role in object detection for autonomous vehicles, which base their decision making on these inputs.
Inertial measurement units are commonly used to estimate the attitude of moving objects.
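A common lightweight way to fuse the two IMU modalities for attitude is a complementary filter: the gyroscope integrates smoothly but drifts, while the accelerometer's gravity-based tilt estimate is noisy but drift-free. The sketch below (an illustrative example with an assumed blending factor, not a specific system's implementation) shows one pitch-angle update step:

```python
import math

def complementary_filter(gyro_rate, accel, angle_prev, dt, alpha=0.98):
    """One update step of a complementary filter for pitch (radians).

    gyro_rate  : angular rate about the pitch axis (rad/s)
    accel      : (ax, az) body-frame accelerometer components (any units)
    angle_prev : previous pitch estimate (rad)
    alpha      : blend factor; higher trusts the gyro more short-term
    """
    accel_angle = math.atan2(accel[0], accel[1])   # tilt from gravity
    gyro_angle = angle_prev + gyro_rate * dt       # integrated gyro
    # High-pass the gyro path, low-pass the accelerometer path
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# With the device level (gravity along z) and a drifted estimate of 0.5 rad,
# repeated updates decay the estimate back toward the true 0 rad.
angle = 0.5
for _ in range(200):
    angle = complementary_filter(0.0, (0.0, 1.0), angle, dt=0.01)
```

Each step multiplies the drift error by alpha, so the accelerometer slowly corrects the gyro's accumulated bias without amplifying its high-frequency noise.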
Rescue vessels are the main actors in maritime safety and rescue operations.