Search Results for author: Ashutosh Singla

Found 2 papers, 1 paper with code

Audiovisual Database with 360 Video and Higher-Order Ambisonics Audio for Perception, Cognition, Behavior, and QoE Evaluation Research

no code implementations • 27 Dec 2022 • Thomas Robotham, Ashutosh Singla, Olli S. Rummukainen, Alexander Raake, Emanuël A. P. Habets

Research into multi-modal perception, human cognition, behavior, and attention can benefit from high-fidelity content that may recreate real-life-like scenes when rendered on head-mounted displays.

AVTrack360: An open Dataset and Software recording people's Head Rotations watching 360° Contents on an HMD

1 code implementation • ACM Multimedia Systems Conference 2018 • Stephan Fremerey, Ashutosh Singla, Kay Meseberg, Alexander Raake

In the case of videos, head-saliency data can be used for training saliency models, as information for evaluating decisions during content creation, or as part of streaming solutions for region-of-interest-specific coding, as with the latest tile-based streaming approaches also discussed in standardization bodies such as MPEG.
