Project QEST: QoE evaluations for multi-point cloud streaming
The QEST project aims to explore the limits of immersive streaming by examining how point cloud compression impacts user experience on two of the most advanced head-mounted displays currently available: the Apple Vision Pro and the Meta Quest 3. To do this, we first extend the SPIRIT testing platform to support these next-generation devices, enabling subjective evaluations of point clouds. In parallel, we introduce an adaptive streaming algorithm designed to handle multiple point clouds in real time within the same 3D scene. By adjusting the level of detail (LoD) of each point cloud according to the user’s field of view, the available network bandwidth, and the device’s capabilities, the algorithm ensures a seamless experience while staying within technical constraints (a simplified sketch of this trade-off is shown below). QEST will also release a public dataset of 3D tourist sites. Altogether, by combining algorithmic innovation with platform expansion, QEST strengthens the scalability and compatibility of the SPIRIT infrastructure.
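To make the LoD/bandwidth trade-off concrete, here is a minimal sketch, not the project's actual algorithm: a greedy per-object LoD assignment that upgrades visible, nearby point clouds first until a bandwidth budget or a device-imposed LoD cap is reached. All class names, function names, and bitrate figures are illustrative assumptions.

```python
# Hypothetical sketch of per-point-cloud LoD selection driven by field of view,
# network bandwidth, and device capability. Not the QEST implementation.
from dataclasses import dataclass

@dataclass
class PointCloud:
    name: str
    in_fov: bool            # is the object inside the user's field of view?
    distance_m: float       # distance from the viewer, in metres
    bitrate_kbps: dict      # LoD level -> bitrate required to stream that level

def select_lods(clouds, bandwidth_kbps, device_max_lod):
    """Greedy assignment: every cloud starts at the lowest LoD, then visible
    clouds are upgraded from nearest to farthest while budget remains."""
    lods = {c.name: min(c.bitrate_kbps) for c in clouds}
    budget = bandwidth_kbps - sum(c.bitrate_kbps[lods[c.name]] for c in clouds)

    visible = sorted((c for c in clouds if c.in_fov), key=lambda c: c.distance_m)
    upgraded = True
    while upgraded and budget > 0:
        upgraded = False
        for c in visible:
            nxt = lods[c.name] + 1
            if nxt > device_max_lod or nxt not in c.bitrate_kbps:
                continue  # device cap reached or no higher representation
            extra = c.bitrate_kbps[nxt] - c.bitrate_kbps[lods[c.name]]
            if extra <= budget:
                lods[c.name] = nxt
                budget -= extra
                upgraded = True
    return lods

if __name__ == "__main__":
    scene = [
        PointCloud("statue",   True,  2.0, {0: 800, 1: 2000, 2: 5000}),
        PointCloud("fountain", True,  8.0, {0: 800, 1: 2000, 2: 5000}),
        PointCloud("facade",   False, 15.0, {0: 800, 1: 2000, 2: 5000}),
    ]
    # Example run: {'statue': 2, 'fountain': 1, 'facade': 0}
    print(select_lods(scene, bandwidth_kbps=9000, device_max_lod=2))
```

In this toy scene, the nearest visible object receives the highest LoD the budget allows, the farther visible object gets a middle level, and the off-screen object stays at the base level, which is the qualitative behaviour the adaptive streaming algorithm targets.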
Budget: €100,000.00
Funding: European Union (SPIRIT, Grant Agreement No. 101070672).
Duration: March 2025 – October 2025
Research Area: Immersive Multimedia and Metaverse
Principal Investigator: Simone Porcu
Department of Electrical and Electronic Engineering