Motion-Plane-Adaptive Inter Prediction in 360-Degree Video Coding

7 Feb 2022 · Andy Regensky, Christian Herglotz, André Kaup

Inter prediction is one of the key technologies enabling the high compression efficiency of modern video coding standards. 360-degree video needs to be mapped to the 2D image plane prior to coding in order to allow compression using existing video coding standards. The distortions that inevitably occur when mapping spherical data onto the 2D image plane, however, impair the performance of classical inter prediction techniques. In this paper, we propose a motion-plane-adaptive inter prediction technique (MPA) for 360-degree video that takes the spherical characteristics of 360-degree video into account. Based on the known projection format of the video, MPA allows inter prediction to be performed on different motion planes in 3D space instead of working directly on the 2D image representation, which may in theory be arbitrarily mapped. We furthermore derive a motion-plane-adaptive motion vector prediction technique (MPA-MVP) that allows motion information to be translated between different motion planes and motion models. Our proposed integration of MPA together with MPA-MVP into the state-of-the-art H.266/VVC video coding standard achieves significant Bjontegaard Delta rate savings over the VTM-14.2 baseline of 1.72% on average with a peak of 3.97% based on PSNR, and 1.56% on average with a peak of 3.40% based on WS-PSNR.
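
To illustrate the spherical geometry that motion-plane-based prediction builds on, the following is a minimal sketch, not the authors' implementation: it maps equirectangular (ERP) pixel coordinates to points on the unit sphere and centrally projects them onto a candidate motion plane. The ERP convention, the plane normal, the plane distance, and all function names are illustrative assumptions.

```python
import numpy as np

def erp_to_sphere(u, v, width, height):
    """Map ERP pixel coordinates to unit-sphere points under a common
    convention: longitude phi in [-pi, pi), latitude theta in [-pi/2, pi/2]."""
    phi = ((u + 0.5) / width - 0.5) * 2.0 * np.pi
    theta = (0.5 - (v + 0.5) / height) * np.pi
    x = np.cos(theta) * np.cos(phi)
    y = np.sin(theta)
    z = np.cos(theta) * np.sin(phi)
    return np.stack([x, y, z], axis=-1)

def project_to_plane(points, normal, distance=1.0):
    """Centrally project unit-sphere points onto the plane
    {p : <p, normal> = distance}, i.e. a rectilinear (perspective) view.
    Assumes the points lie on the hemisphere facing the plane normal."""
    normal = np.asarray(normal, dtype=float)
    normal = normal / np.linalg.norm(normal)
    denom = points @ normal            # cosine of the angle to the plane normal
    scale = distance / denom           # ray-plane intersection parameter
    return points * scale[..., None]   # points lying on the chosen plane

# Example: project a 16x16 ERP block near the image center onto an
# assumed front-facing motion plane (normal along +x).
u, v = np.meshgrid(np.arange(120, 136), np.arange(56, 72))
pts = erp_to_sphere(u, v, width=256, height=128)
plane_pts = project_to_plane(pts, normal=[1.0, 0.0, 0.0])
```

On such a plane, a purely translational displacement corresponds to a plausible 3D motion, whereas the same displacement applied directly in the ERP image would be warped by the projection; this is the intuition behind performing inter prediction per motion plane.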
