
Visual Navigation

27 papers with code · Robots

Greatest papers with code

Cognitive Mapping and Planning for Visual Navigation

CVPR 2017 tensorflow/models

The accumulated belief of the world enables the agent to track visited regions of the environment.

VISUAL NAVIGATION
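
The "accumulated belief" in this entry is a learned, differentiable map; as a toy stand-in only, the core idea of tracking visited regions can be illustrated by accumulating agent grid poses into a binary visited map (the function name and grid representation here are hypothetical, not the paper's architecture):

```python
def visited_map(trajectory, size):
    """Accumulate a sequence of (x, y) grid poses into a size x size
    binary map marking which cells the agent has visited."""
    grid = [[False] * size for _ in range(size)]
    for x, y in trajectory:
        grid[y][x] = True  # row = y, column = x
    return grid

# An agent that moved right, then down, leaves these cells marked:
grid = visited_map([(0, 0), (1, 0), (1, 1)], size=3)
```

In the paper this map is maintained in an egocentric frame and updated by a neural network rather than by discrete pose lookups.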

Visual Representations for Semantic Target Driven Navigation

15 May 2018 tensorflow/models

We propose using high-level semantic and contextual features, including segmentation and detection masks obtained from off-the-shelf state-of-the-art vision models, as observations, and use a deep network to learn the navigation policy.

DOMAIN ADAPTATION VISUAL NAVIGATION
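
The "segmentation masks as observations" idea amounts to feeding the policy one binary channel per semantic class instead of raw pixels. A minimal sketch of that preprocessing step (function name and class-ID encoding are assumptions, not the paper's code):

```python
import numpy as np

def semantic_observation(label_map, num_classes):
    """Convert an HxW per-pixel class-ID map into a
    (num_classes, H, W) stack of binary masks, one channel per class."""
    h, w = label_map.shape
    obs = np.zeros((num_classes, h, w), dtype=np.float32)
    for c in range(num_classes):
        obs[c] = (label_map == c)
    return obs
```

The resulting channel stack would then be the input to whatever policy network is being trained; because it abstracts away texture and lighting, it is also what makes the domain-adaptation angle of the paper plausible.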

Reinforced Cross-Modal Matching and Self-Supervised Imitation Learning for Vision-Language Navigation

CVPR 2019 extreme-assistant/cvpr2020

Vision-language navigation (VLN) is the task of navigating an embodied agent to carry out natural language instructions inside real 3D environments.

IMITATION LEARNING VISION-LANGUAGE NAVIGATION VISUAL NAVIGATION

Are We Making Real Progress in Simulated Environments? Measuring the Sim2Real Gap in Embodied Visual Navigation

13 Dec 2019 facebookresearch/habitat-api

We find that the Sim2Real Correlation Coefficient (SRCC) for Habitat, as used for the CVPR19 challenge, is low (0.18 for the success metric), which suggests that performance improvements on this simulator-based challenge would not transfer well to a physical robot.

POINTGOAL NAVIGATION VISUAL NAVIGATION
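
The SRCC figure above quantifies how well per-agent performance in simulation predicts performance on the real robot. As an illustration only, such a sim-vs-real correlation can be computed as a plain Pearson correlation over paired metric values (the paper's exact SRCC definition and the numbers below are not taken from the paper):

```python
def correlation(sim, real):
    """Pearson correlation between paired sim and real metric values,
    e.g. per-agent success rates measured in both settings."""
    n = len(sim)
    mean_s, mean_r = sum(sim) / n, sum(real) / n
    cov = sum((a - mean_s) * (b - mean_r) for a, b in zip(sim, real))
    std_s = sum((a - mean_s) ** 2 for a in sim) ** 0.5
    std_r = sum((b - mean_r) ** 2 for b in real) ** 0.5
    return cov / (std_s * std_r)

# Hypothetical success rates for four agents, in sim and on a robot:
srcc = correlation([0.95, 0.90, 0.85, 0.80], [0.50, 0.62, 0.48, 0.55])
```

A value near 1.0 would mean the sim leaderboard ordering transfers to hardware; a value like 0.18 means it largely does not.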

An Open Source and Open Hardware Deep Learning-powered Visual Navigation Engine for Autonomous Nano-UAVs

10 May 2019 pulp-platform/pulp-dronet

Nano-size unmanned aerial vehicles (UAVs), a few centimeters in diameter and with a sub-10 W total power budget, have so far been considered incapable of running sophisticated vision-based autonomous navigation software without external aid from base stations, ad-hoc local positioning infrastructure, and powerful external computation servers.

AUTONOMOUS NAVIGATION VISUAL NAVIGATION

A 64mW DNN-based Visual Navigation Engine for Autonomous Nano-Drones

4 May 2018 pulp-platform/pulp-dronet

As part of our general methodology, we discuss the software mapping techniques that enable the state-of-the-art deep convolutional neural network presented in [1] to be fully executed on board within a strict 6 fps real-time constraint, with no compromise in flight results, while all processing draws only 64 mW on average.

AUTONOMOUS NAVIGATION VISUAL NAVIGATION

Vision-and-Language Navigation: Interpreting visually-grounded navigation instructions in real environments

CVPR 2018 peteanderson80/Matterport3DSimulator

This is significant because a robot interpreting a natural-language navigation instruction on the basis of what it sees is carrying out a vision and language process that is similar to Visual Question Answering.

VISUAL NAVIGATION VISUAL QUESTION ANSWERING

Learning to Learn How to Learn: Self-Adaptive Visual Navigation Using Meta-Learning

CVPR 2019 allenai/savn

In this paper we study the problem of learning to learn at both training and test time in the context of visual navigation.

META-LEARNING META REINFORCEMENT LEARNING VISUAL NAVIGATION

The Regretful Agent: Heuristic-Aided Navigation through Progress Estimation

CVPR 2019 chihyaoma/regretful-agent

As deep learning continues to make progress for challenging perception tasks, there is increased interest in combining vision, language, and decision-making.

DECISION MAKING VISION-LANGUAGE NAVIGATION VISUAL NAVIGATION