Autonomous Navigation
131 papers with code • 0 benchmarks • 5 datasets
Autonomous navigation is the task of guiding a vehicle or robot to, or around, a location without human intervention.
(Image credit: Approximate LSTMs for Time-Constrained Inference: Enabling Fast Reaction in Self-Driving Cars)
Benchmarks
These leaderboards are used to track progress in Autonomous Navigation.
Latest papers
SUMMIT: Source-Free Adaptation of Uni-Modal Models to Multi-Modal Targets
In this work, we relax both of these assumptions by addressing the problem of adapting a set of models trained independently on uni-modal data to a target domain consisting of unlabeled multi-modal data, without having access to the original source dataset.
Efficient-VRNet: An Exquisite Fusion Network for Riverway Panoptic Perception based on Asymmetric Fair Fusion of Vision and 4D mmWave Radar
In this paper, we focus on riverway panoptic perception based on USVs, which is a considerably unexplored field compared with road panoptic perception.
Vision-Based Autonomous Navigation for Unmanned Surface Vessel in Extreme Marine Conditions
To overcome these issues, this paper presents an autonomous vision-based navigation framework for tracking target objects in extreme marine conditions.
Improving Generalization of Synthetically Trained Sonar Image Descriptors for Underwater Place Recognition
Autonomous navigation in underwater environments presents challenges due to factors such as light absorption and water turbidity, limiting the effectiveness of optical sensors.
Trust-aware Safe Control for Autonomous Navigation: Estimation of System-to-human Trust for Trust-adaptive Control Barrier Functions
A trust-aware safe control system for autonomous navigation in the presence of humans, specifically pedestrians, is presented.
Achelous: A Fast Unified Water-surface Panoptic Perception Framework based on Fusion of Monocular Camera and 4D mmWave Radar
Current perception models for different tasks usually exist in modular forms on Unmanned Surface Vehicles (USVs); they run extremely slowly in parallel on edge devices, causing asynchrony between perception results and the USV's position and leading to erroneous navigation decisions.
GP-guided MPPI for Efficient Navigation in Complex Unknown Cluttered Environments
This study presents the GP-MPPI, an online learning-based control strategy that integrates MPPI with a local perception model based on Sparse Gaussian Process (SGP).
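The MPPI half of GP-MPPI can be illustrated in a few lines: sample noisy perturbations of a nominal control sequence, roll each one out through the dynamics, and average the perturbations with exponential weights on the rollout costs. The sketch below is a minimal, generic MPPI step on a toy 2D integrator; it is an assumption-laden illustration (the `dyn`/`cst` models, all hyperparameters, and the warm-start scheme are made up here), not the paper's SGP-guided variant.

```python
import numpy as np

def mppi_step(x0, U, dynamics, cost, rng, n_samples=256, sigma=0.5, lam=1.0):
    """One MPPI update: sample perturbed control sequences, roll them out,
    and return the cost-weighted average control sequence."""
    horizon, udim = U.shape
    noise = rng.normal(0.0, sigma, size=(n_samples, horizon, udim))
    costs = np.zeros(n_samples)
    for k in range(n_samples):
        x = x0.copy()
        for t in range(horizon):
            u = U[t] + noise[k, t]
            x = dynamics(x, u)
            costs[k] += cost(x, u)
    # Exponential weighting (softmin over rollout costs, temperature lam).
    w = np.exp(-(costs - costs.min()) / lam)
    w /= w.sum()
    return U + np.einsum('k,ktd->td', w, noise)

# Toy example: steer a 2D point toward the origin.
dyn = lambda x, u: x + 0.1 * u                  # single-integrator dynamics
cst = lambda x, u: float(x @ x + 0.01 * u @ u)  # quadratic state + control cost

rng = np.random.default_rng(0)
U = np.zeros((20, 2))          # nominal control sequence over the horizon
x = np.array([2.0, 2.0])
for _ in range(30):
    U = mppi_step(x, U, dyn, cst, rng)
    x = dyn(x, U[0])                        # apply the first control
    U = np.roll(U, -1, axis=0); U[-1] = 0.0 # warm-start the next step
```

In GP-MPPI the cost/guidance additionally comes from a local Sparse Gaussian Process perception model; here the cost is a fixed quadratic purely for illustration.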
Real-time Vision-based Navigation for a Robot in an Indoor Environment
The findings contribute to the advancement of indoor robot navigation, showcasing the potential of vision-based techniques for real-time, autonomous navigation.
Boosting Adversarial Robustness using Feature Level Stochastic Smoothing
Advances in adversarial defenses have led to a significant improvement in the robustness of Deep Neural Networks.
Agronav: Autonomous Navigation Framework for Agricultural Robots and Vehicles using Semantic Segmentation and Semantic Line Detection
The successful implementation of vision-based navigation in agricultural fields hinges upon two critical components: 1) the accurate identification of key components within the scene, and 2) the identification of lanes through the detection of boundary lines that separate the crops from the traversable ground.
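The second component, recovering a boundary line between crops and traversable ground, can be illustrated with a toy stand-in: given a binary segmentation mask, take the first ground pixel in each row and fit a line through those boundary points. The mask, geometry, and least-squares fit below are all hypothetical placeholders for Agronav's learned semantic line detection.

```python
import numpy as np

# Toy binary mask: 1 = crop, 0 = traversable ground (hypothetical 10x10 scene
# where the crop region widens with image row).
mask = np.zeros((10, 10), dtype=int)
for r in range(10):
    mask[r, : 3 + r // 2] = 1

# For each row, locate the first ground pixel (the crop/ground boundary),
# then fit a boundary line col = a*row + b by least squares.
rows = np.arange(10)
cols = np.array([int(np.argmax(mask[r] == 0)) for r in rows])
a, b = np.polyfit(rows, cols, 1)
```

A real pipeline would extract this boundary from a semantic segmentation network's output and use a robust or learned line detector rather than a plain least-squares fit.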