Drone-view target localization

6 papers with code • 1 benchmark • 1 dataset

(Drone -> Satellite) Given a drone-view image or video, the task is to retrieve the most similar satellite-view image, thereby localizing the target building in the satellite view.
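In practice this is cast as image retrieval: embed the drone query and every satellite gallery image, then rank the gallery by cosine similarity. The sketch below only illustrates that protocol with a generic ImageNet backbone as a stand-in encoder; the papers listed further down train dedicated cross-view encoders instead.

```python
import torch
import torch.nn.functional as F
import torchvision.models as models

# Generic stand-in encoder (assumption): any feature extractor fits this protocol.
encoder = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()           # strip the classifier -> 2048-d features
encoder.eval()

@torch.no_grad()
def embed(images):                         # images: (N, 3, H, W)
    return F.normalize(encoder(images), dim=1)

drone_query = torch.randn(1, 3, 224, 224)          # one drone-view query
satellite_gallery = torch.randn(8, 3, 224, 224)    # candidate satellite tiles

q = embed(drone_query)                     # (1, D)
g = embed(satellite_gallery)               # (G, D)
scores = q @ g.t()                         # cosine similarities, (1, G)
best = scores.argmax(dim=1)                # most similar satellite tile
print("predicted satellite image:", best.item())
```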

Most implemented papers

University-1652: A Multi-view Multi-source Benchmark for Drone-based Geo-localization

layumi/University1652-Baseline 27 Feb 2020

To our knowledge, University-1652 is the first drone-based geo-localization dataset and enables two new tasks, i.e., drone-view target localization and drone navigation.
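A common baseline recipe on this dataset treats every training building as a class and trains both views against that shared identity, so that matched drone and satellite images land close together in feature space. The sketch below follows that instance-loss idea under stated assumptions (shared backbone, 701 training identities); it is not a line-by-line reproduction of layumi/University1652-Baseline.

```python
import torch
import torch.nn as nn
import torchvision.models as models

# Hedged sketch of a two-branch instance-loss baseline: both views are mapped
# into one embedding space by classifying the shared building identity.
# (num_buildings=701 is an assumption about the training split; the repository's
# actual architecture differs in detail.)
class TwoBranchBaseline(nn.Module):
    def __init__(self, num_buildings=701, dim=512):
        super().__init__()
        backbone = models.resnet50(weights=None)
        backbone.fc = nn.Identity()
        self.backbone = backbone                    # shared between views
        self.embed = nn.Linear(2048, dim)
        self.classifier = nn.Linear(dim, num_buildings)

    def forward(self, x):
        f = self.embed(self.backbone(x))            # retrieval feature
        return f, self.classifier(f)                # logits for the identity loss

model = TwoBranchBaseline()
criterion = nn.CrossEntropyLoss()
drone, satellite = torch.randn(4, 3, 224, 224), torch.randn(4, 3, 224, 224)
labels = torch.randint(0, 701, (4,))                # same building IDs in both views
_, logit_d = model(drone)
_, logit_s = model(satellite)
loss = criterion(logit_d, labels) + criterion(logit_s, labels)
loss.backward()
```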

Each Part Matters: Local Patterns Facilitate Cross-view Geo-localization

wtyhub/LPN 26 Aug 2020

Existing methods usually concentrate on mining fine-grained features of the geographic target in the image center, but underestimate the contextual information in neighboring areas.
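LPN's response is a square-ring partition of the feature map, so that context around the centered target contributes part-level descriptors alongside the target itself. The function below is a hedged illustration of such a partition; the ring boundaries are an illustrative choice, not the exact scheme in wtyhub/LPN.

```python
import torch

def square_ring_pool(feat, num_rings=4):
    """Hedged sketch: average-pool a feature map over concentric square rings,
    so the center ring covers the target and outer rings capture context."""
    B, C, H, W = feat.shape
    ys = torch.arange(H).view(H, 1).expand(H, W)
    xs = torch.arange(W).view(1, W).expand(H, W)
    # Chebyshev-style distance of each cell from the map center
    dist = torch.maximum((ys - (H - 1) / 2).abs(), (xs - (W - 1) / 2).abs())
    edges = torch.linspace(0, dist.max().item() + 1e-6, num_rings + 1)
    parts = []
    for i in range(num_rings):
        mask = (dist >= edges[i]) & (dist < edges[i + 1])
        region = feat[:, :, mask]                  # (B, C, #cells in ring i)
        parts.append(region.mean(dim=2))           # one (B, C) descriptor per ring
    return torch.stack(parts, dim=1)               # (B, num_rings, C)

feat = torch.randn(2, 2048, 8, 8)                  # e.g. a ResNet-50 conv5 output
print(square_ring_pool(feat).shape)                # torch.Size([2, 4, 2048])
```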

Understanding Image Retrieval Re-Ranking: A Graph Neural Network Perspective

Xuanmeng-Zhang/gnn-re-ranking 14 Dec 2020

We argue that the first phase amounts to building the k-nearest-neighbor graph, while the second phase can be viewed as spreading messages within the graph.
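Read minimally, the two phases are: build a k-nearest-neighbor graph over gallery features, then propagate each node's feature to its neighbors before re-scoring the query. The sketch below is a simplified illustration of that reading, not the GNN formulation used in Xuanmeng-Zhang/gnn-re-ranking.

```python
import torch
import torch.nn.functional as F

def knn_rerank(query, gallery, k=5, alpha=0.5):
    """Hedged sketch of graph-based re-ranking:
    phase 1 builds a k-NN graph over gallery features;
    phase 2 spreads each feature to its neighbors, then re-scores the query."""
    q = F.normalize(query, dim=1)                     # (1, D)
    g = F.normalize(gallery, dim=1)                   # (G, D)
    sim = g @ g.t()                                   # gallery-gallery similarity
    nn_idx = sim.topk(k + 1, dim=1).indices[:, 1:]    # k neighbors, dropping self
    neighbor_mean = g[nn_idx].mean(dim=1)             # message passing: average neighbors
    g_prop = F.normalize(alpha * g + (1 - alpha) * neighbor_mean, dim=1)
    return (q @ g_prop.t()).argsort(dim=1, descending=True)   # re-ranked gallery order

query, gallery = torch.randn(1, 512), torch.randn(100, 512)
print(knn_rerank(query, gallery)[0, :10])             # top-10 gallery indices after re-ranking
```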

A Transformer-Based Feature Segmentation and Region Alignment Method For UAV-View Geo-Localization

dmmm1997/fsra 23 Jan 2022

However, it still has some limitations, e.g., it can only extract part of the information in the neighborhood, and some scale-reduction operations cause fine-grained information to be lost.

Joint Representation Learning and Keypoint Detection for Cross-view Geo-localization

AggMan96/RK-Net IEEE Transactions on Image Processing (TIP) 2022

Inspired by the human visual system for mining local patterns, we propose a new framework called RK-Net to jointly learn the discriminative Representation and detect salient Keypoints with a single Network.
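The core idea is that keypoint detection and representation learning share one network rather than running as separate stages. The module below is only a hedged illustration of that coupling: a lightweight unit scores how salient each spatial location is and reweights the representation with that score. It is not the actual unit used in AggMan96/RK-Net.

```python
import torch
import torch.nn as nn

# Hedged sketch: a parameter-light unit that produces a keypoint/saliency map
# and reuses it to reweight the feature, so detection and representation
# learning are optimized jointly in a single network.
class SalientKeypointUnit(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, kernel_size=1)    # per-location saliency

    def forward(self, feat):                                  # feat: (B, C, H, W)
        saliency = torch.sigmoid(self.score(feat))            # (B, 1, H, W) keypoint map
        return feat * saliency, saliency                      # reweighted feature + map

unit = SalientKeypointUnit(channels=2048)
feat = torch.randn(2, 2048, 8, 8)
weighted, keypoint_map = unit(feat)
print(weighted.shape, keypoint_map.shape)
```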

Sample4Geo: Hard Negative Sampling For Cross-View Geo-Localisation

Skyy93/Sample4Geo ICCV 2023

In this article, we present a simplified but effective architecture based on contrastive learning with a symmetric InfoNCE loss that outperforms the current state of the art.
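The symmetric InfoNCE loss named above treats matched drone/satellite pairs as positives on the diagonal of a batch similarity matrix and every other in-batch entry as a negative, averaged over both retrieval directions. The sketch below shows that loss in isolation; the temperature value and Sample4Geo's hard-negative sampling strategy are not reproduced here.

```python
import torch
import torch.nn.functional as F

def symmetric_infonce(drone_emb, sat_emb, temperature=0.07):
    """Symmetric InfoNCE over a batch of paired drone/satellite embeddings:
    matched pairs sit on the diagonal, other batch entries act as in-batch
    negatives, and the loss is averaged over both directions."""
    d = F.normalize(drone_emb, dim=1)
    s = F.normalize(sat_emb, dim=1)
    logits = d @ s.t() / temperature                  # (B, B) similarity matrix
    targets = torch.arange(logits.size(0), device=logits.device)
    loss_d2s = F.cross_entropy(logits, targets)       # drone -> satellite
    loss_s2d = F.cross_entropy(logits.t(), targets)   # satellite -> drone
    return (loss_d2s + loss_s2d) / 2

drone_emb, sat_emb = torch.randn(16, 512), torch.randn(16, 512)
print(symmetric_infonce(drone_emb, sat_emb))
```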