Semantic Sensing and Communications for Ultimate Extended Reality

16 Dec 2022 · BoWen Zhang, Zhijin Qin, Yiyu Guo, Geoffrey Ye Li

As a key technology in the metaverse, wireless ultimate extended reality (XR) has attracted extensive attention from both industry and academia. However, its stringent latency and ultra-high data rate requirements have hindered the development of wireless ultimate XR. Instead of transmitting the original source data bit-by-bit, semantic communications focus on the successful delivery of the semantic information contained in the source, which has shown great potential in reducing the data traffic of wireless systems. Inspired by semantic communications, this article develops a joint semantic sensing, rendering, and communication framework for wireless ultimate XR. In particular, semantic sensing improves sensing efficiency by exploiting the spatial-temporal distribution of semantic information. Semantic rendering reduces the rendering cost spent on semantically redundant pixels. Semantic communications are then adopted to achieve high transmission efficiency in wireless ultimate XR. Two case studies are provided to demonstrate the effectiveness of the proposed framework. Finally, potential research directions are identified to boost the development of semantic-aware wireless ultimate XR.
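To make the semantic-communication idea concrete, the sketch below shows a minimal learned pipeline: a semantic encoder compresses source pixels into a low-dimensional semantic latent, the latent passes through a noisy (AWGN) channel, and a semantic decoder reconstructs the source. This is an illustrative example only, not the architecture proposed in the paper; the layer sizes, latent dimension, and SNR value are hypothetical choices.

```python
# Minimal, illustrative semantic-communication sketch (not the paper's model):
# pixels -> semantic encoder -> noisy channel -> semantic decoder -> pixels.
import torch
import torch.nn as nn

class SemanticEncoder(nn.Module):
    def __init__(self, in_dim=3072, latent_dim=64):  # e.g. a flattened 32x32x3 patch
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class SemanticDecoder(nn.Module):
    def __init__(self, latent_dim=64, out_dim=3072):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, out_dim),
        )

    def forward(self, z):
        return self.net(z)

def awgn_channel(z, snr_db=10.0):
    """Add white Gaussian noise to the transmitted latent at a given SNR (dB)."""
    signal_power = z.pow(2).mean()
    noise_power = signal_power / (10 ** (snr_db / 10))
    return z + torch.sqrt(noise_power) * torch.randn_like(z)

# End-to-end forward pass; in practice encoder and decoder are trained jointly
# so that the reconstruction (or a downstream semantic task) survives the channel.
encoder, decoder = SemanticEncoder(), SemanticDecoder()
x = torch.rand(8, 3072)                      # batch of flattened image patches
x_hat = decoder(awgn_channel(encoder(x)))
loss = nn.functional.mse_loss(x_hat, x)
print(f"reconstruction MSE: {loss.item():.4f}")
```

Because only the 64-dimensional semantic latent is transmitted rather than the 3072 raw pixel values, the channel payload in this toy setup shrinks by roughly 48x, which illustrates the data-traffic reduction that motivates semantic communications for bandwidth-hungry XR.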
