UniFuse: Unidirectional Fusion for 360$^{\circ}$ Panorama Depth Estimation

6 Feb 2021  ·  Hualie Jiang, Zhe Sheng, Siyu Zhu, Zilong Dong, Rui Huang

Learning depth from spherical panoramas is becoming a popular research topic because a panorama captures the full field-of-view of the environment and provides a relatively complete description of a scene. However, applying well-studied CNNs designed for perspective images to the standard representation of spherical panoramas, i.e., the equirectangular projection, is suboptimal, as the projection becomes increasingly distorted toward the poles. Another representation is the cubemap projection, which is distortion-free but discontinuous at face edges, with each face covering only a limited field-of-view. This paper introduces a new framework that fuses features from the two projections, unidirectionally feeding the cubemap features into the equirectangular features only at the decoding stage. Unlike the recent bidirectional fusion approach, which operates at both the encoding and decoding stages, our fusion scheme is much more efficient. In addition, we design a more effective fusion module for our fusion scheme. Experiments verify the effectiveness of our proposed fusion strategy and module, and our model achieves state-of-the-art performance on four popular datasets. Additional experiments show that our model also offers advantages in model complexity and generalization capability. The code is available at https://github.com/alibaba/UniFuse-Unidirectional-Fusion.
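To make the fusion scheme concrete, below is a minimal PyTorch sketch of one unidirectional fusion step at a single decoder level. It is not the authors' exact module (see the linked repository for that); it assumes the cubemap features have already been re-projected into equirectangular layout, and the name `UniFusionBlock` is illustrative.

```python
# Illustrative sketch only -- not the official UniFuse fusion module.
# Assumes cubemap features were re-projected to equirectangular (ERP) layout
# beforehand (e.g., with a cube-to-equirect resampling step).
import torch
import torch.nn as nn


class UniFusionBlock(nn.Module):
    """Fuses re-projected cubemap features into the ERP decoder features
    at one decoding stage. The fusion is unidirectional: only the ERP
    stream is updated; nothing flows back to the cubemap branch."""

    def __init__(self, channels: int):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, erp_feat: torch.Tensor,
                cube_feat_erp: torch.Tensor) -> torch.Tensor:
        # Concatenate both feature maps, project back to `channels`,
        # and add a residual so the ERP stream stays dominant.
        fused = self.fuse(torch.cat([erp_feat, cube_feat_erp], dim=1))
        return erp_feat + fused


# Usage: fuse 64-channel features at one decoder level.
block = UniFusionBlock(64)
erp = torch.randn(1, 64, 128, 256)          # ERP decoder features (H x 2H)
cube_in_erp = torch.randn(1, 64, 128, 256)  # cubemap features after re-projection
out = block(erp, cube_in_erp)               # same shape as `erp`
print(out.shape)  # torch.Size([1, 64, 128, 256])
```

Because the cubemap branch never receives features back, only one such block per decoder level is needed, which is where the efficiency gain over bidirectional fusion comes from.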


Results from the Paper


Task             | Dataset                | Model               | Metric Name             | Metric Value | Global Rank
-----------------|------------------------|---------------------|-------------------------|--------------|------------
Depth Estimation | Matterport3D           | UniFuse             | Abs Rel                 | 0.1063       | #1
Depth Estimation | Stanford2D3D Panoramic | UniFuse with fusion | RMSE                    | 0.3691       | #11
Depth Estimation | Stanford2D3D Panoramic | UniFuse with fusion | absolute relative error | 0.1114       | #11

Methods


No methods listed for this paper.