IXR '23: Proceedings of the 2nd International Workshop on Interactive eXtended Reality


SESSION: Keynote Talk 1

Towards Volumetric Video Realism in Extended Reality: Challenges and Opportunities

  • Wei Tsang Ooi

Advances in volumetric capture, compression, and rendering techniques have enabled telepresence in extended reality (XR) environments. Live or pre-recorded volumetric video of an avatar can be streamed over the network and rendered in a client's XR environment, creating an illusion of spatial co-presence. In this keynote talk, I will first present a case for the importance of visual realism of volumetric video in such scenarios. I will then present existing approaches toward higher visual realism in volumetric video, dividing them into two categories: (i) approaches that achieve smoother motion through temporal up-sampling and (ii) approaches that obtain better detail through spatial up-sampling. The former aims at a rendering frame rate close to what the human brain perceives as smooth motion, while the latter allows users to move closer to an avatar without losing its realism. The talk will also outline the trade-offs and limitations of current up-sampling approaches. I will conclude with my personal view on the research challenges and opportunities that the research community should confront to achieve a true-to-life XR experience.

SESSION: Technical Session

Towards Optimising Transport Protocols on the 5G Edge for Mobile Augmented Reality

  • Jacky Cao
  • Xiang Su
  • Pan Hui

Mobile augmented reality (MAR) achieves immersive real-time experiences by offloading computation-intensive computer vision tasks. 5G NR (5G) networks and edge computing enable optimised latency and enhanced throughput for MAR offloading. However, efficiently leveraging the 5G edge also requires optimising the data link between MAR devices and server machines. We report preliminary experiments on optimising transport protocols (i.e., UDP and TCP) to understand how we could further modify the data link between MAR clients and servers. Preliminary analysis on a real-world 5G testbed indicates that, for our settings, TCP with default configuration parameters has the lowest round-trip time on 5G, with a median of 15.8 ± 10.3 ms. Increasing the protocol buffer sizes to 100 KB and 1000 KB further decreases packet latency and jitter while increasing throughput for some media resolutions. We further discuss how optimising transport protocol parameters could improve the service quality and experience for MAR applications.
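As a rough illustration of the kind of parameter tuning discussed above, the sketch below (not the authors' code) enlarges the TCP socket buffers and times request/response round trips against an edge server; the address EDGE_HOST/EDGE_PORT, the chosen buffer size, and the assumption that the server echoes each payload are all hypothetical.

```python
# Hedged sketch: enlarge kernel send/receive buffers and measure TCP round-trip
# times. EDGE_HOST/EDGE_PORT and the echo behaviour are assumptions for illustration.
import socket
import time

EDGE_HOST, EDGE_PORT = "192.0.2.10", 5000   # placeholder edge-server address
BUFFER_SIZE = 100 * 1024                    # e.g. 100 KB, one of the sizes explored above

def measure_tcp_rtt(payload: bytes, repetitions: int = 30) -> list:
    rtts = []
    with socket.create_connection((EDGE_HOST, EDGE_PORT)) as sock:
        # The OS may clamp these values; check with getsockopt if exact sizes matter.
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, BUFFER_SIZE)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, BUFFER_SIZE)
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        for _ in range(repetitions):
            start = time.perf_counter()
            sock.sendall(payload)
            received = 0
            while received < len(payload):          # assumes the server echoes the payload
                chunk = sock.recv(len(payload) - received)
                if not chunk:
                    break
                received += len(chunk)
            rtts.append((time.perf_counter() - start) * 1000.0)
    return rtts

if __name__ == "__main__":
    samples = sorted(measure_tcp_rtt(b"\x00" * 1400))   # one MTU-sized probe
    print(f"median RTT: {samples[len(samples) // 2]:.1f} ms")
```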

A Haptic-enabled, Distributed and Networked Immersive System for Multi-User Collaborative Virtual Reality

  • Sam Van Damme
  • Fangio Van de Velde
  • Mohammad Javad Sameri
  • Filip De Turck
  • Maria Torres Vega

Virtual Reality (VR) is gaining attention in various domains such as entertainment, industry, mental healthcare and training. Although most of these use cases are still limited to single-user tasks, many applications depend heavily on multi-user collaboration. However, existing multi-user VR systems are most often built on a classic client-server architecture, which can induce unpredictable network behaviour that affects the end-user's Quality of Experience (QoE) and performance. In addition, interaction in these systems is often constrained to either traditional VR controllers or very use-case-specific methods, leaving general-purpose haptic gloves a relatively under-explored part of the literature. Therefore, we (i) present a networked, distributed multi-user VR system that synchronizes environments over a low-bandwidth network connection, and (ii) enhance the experience by adding haptic gloves to the system, which we compare to traditional VR controllers in a subjective experiment. As a proof of concept, we implement a use case in which two users have to prepare and bake a virtual pizza. The results show that high frame rates (> 90 Frames Per Second (FPS)) can be obtained while keeping network throughput to a minimum (< 1 Mbps). The accompanying user study shows that haptic gloves are preferred when immersiveness is the main emphasis of the virtual environment, while controllers are more suited when performance is the main focus. In objective terms, the applicability of haptic feedback is highly dependent on the task at hand.
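One common way to keep synchronization traffic this low, sketched below purely for illustration (this is not necessarily the authors' implementation), is to broadcast compact per-object state deltas rather than full scene snapshots; the function name make_delta_packet and the update rates used in the comment are assumptions.

```python
# Illustrative sketch, not the authors' system: send a fixed-size binary delta
# only for objects whose transform changed during the last tick.
import struct

def make_delta_packet(object_id: int, position: tuple, rotation: tuple, changed: bool) -> bytes:
    """Serialize one object's transform (id + 3-float position + 4-float quaternion)."""
    if not changed:
        return b""                      # static objects contribute no traffic
    return struct.pack("<i3f4f", object_id, *position, *rotation)   # 32 bytes per object

# Back-of-the-envelope check (assumed numbers): 90 updates/s * 40 moving objects
# * 32 bytes * 8 bits ≈ 0.9 Mbps, consistent with the < 1 Mbps figure reported above.
```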

Subjective Quality Evaluation of Point Clouds using Remote Testing

  • Ashutosh Singla
  • Shuang Wang
  • Steve Göring
  • Rakesh Rao Ramachandra Rao
  • Irene Viola
  • Pablo Cesar
  • Alexander Raake

Subjective quality assessment serves as a method to evaluate the perceptual quality of 3D point clouds. These evaluations can be conducted as lab-based tests or as remote/crowdsourcing tests. Lab-based tests are time-consuming and less cost-effective; remote or crowd tests offer a time- and cost-friendly alternative and enable larger and more diverse participant pools. However, variability in participants' display devices and viewing environments raises the question of whether remote testing is applicable to point cloud evaluation. In this paper, we investigate the applicability of remote testing by using the Absolute Category Rating (ACR) test method to assess the subjective quality of point clouds in different tests. We compare the results of lab and remote tests by replicating lab-based tests. In the first test, we assess the subjective quality of static point cloud geometry for two types of geometrical degradation, namely Gaussian noise and octree pruning. In the second test, we compare the performance of two compression methods (G-PCC and V-PCC) for the subjective quality of coloured point cloud videos. Based on the results obtained using correlation and Standard deviation of Opinion Scores (SOS) analysis, the remote testing paradigm can be used for evaluating point clouds.
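The sketch below is a hedged illustration of the kind of lab-versus-remote analysis described above, not the authors' scripts: given per-stimulus ACR ratings from a lab test and a remote test, it computes mean opinion scores (MOS), the standard deviation of opinion scores (SOS), and correlations between the two paradigms; the rating arrays are made-up placeholders.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# rows = stimuli (processed point clouds), columns = participants, values = ACR scores 1..5
lab_ratings    = np.array([[5, 4, 5, 4], [3, 3, 2, 3], [1, 2, 1, 2]])
remote_ratings = np.array([[5, 5, 4, 4, 4], [3, 2, 3, 3, 2], [2, 1, 1, 2, 1]])

mos_lab, mos_remote = lab_ratings.mean(axis=1), remote_ratings.mean(axis=1)
sos_lab, sos_remote = lab_ratings.std(axis=1, ddof=1), remote_ratings.std(axis=1, ddof=1)

plcc, _  = pearsonr(mos_lab, mos_remote)    # linear agreement between test paradigms
srocc, _ = spearmanr(mos_lab, mos_remote)   # rank-order agreement
print(f"PLCC={plcc:.3f}  SROCC={srocc:.3f}  "
      f"mean SOS lab={sos_lab.mean():.2f} remote={sos_remote.mean():.2f}")
```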

Correlation between Entropy and Prediction Error in VR Head Motion Trajectories

  • Silvia Rossi
  • Laura Toni
  • Pablo Cesar

The general understanding of user behaviour has often been overlooked in the field of Virtual Reality (VR) and Extended Reality (XR) at large. In this work, we aim to fill this gap by exploring the relationship between the way users navigate in immersive content and the predictability of their trajectories. Inspired by works from social science, our key assumption is that some navigation trajectories can be accurately predicted, while others exhibit eclectic patterns that are more challenging to anticipate. However, it is not yet clear how to effectively distinguish between these behaviours. In this context, we conduct an extensive data analysis across multiple datasets of users' movements in VR. The ultimate goal is to understand whether a metric from information theory, the entropy of a trajectory, can be adopted to discriminate between predictable and unpredictable navigation trajectories. Our findings reveal that users with highly regular navigation styles tend to exhibit lower entropy, indicating higher predictability of their movements. Conversely, users with more diverse navigation patterns show higher entropy and lower predictability in their trajectories. Answering the question "how can we distinguish users who are more predictable than others?" would be crucial for different purposes in future immersive applications, such as enabling new modalities for live streaming services, but also for the design of more personalised and engaging VR experiences.
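As a minimal illustration of the idea (not the paper's exact pipeline), one simple way to quantify the entropy of a VR head-motion trajectory is to discretize viewing directions into tiles on the sphere and compute the Shannon entropy of the tile-visit distribution; the bin counts and the synthetic trajectories below are assumptions for demonstration only.

```python
import numpy as np

def trajectory_entropy(yaw: np.ndarray, pitch: np.ndarray, n_yaw: int = 12, n_pitch: int = 6) -> float:
    """Shannon entropy (bits) of the distribution of visited yaw/pitch tiles."""
    yaw_bins   = np.floor((yaw % 360.0) / (360.0 / n_yaw)).astype(int)
    pitch_bins = np.floor((np.clip(pitch, -90.0, 89.999) + 90.0) / (180.0 / n_pitch)).astype(int)
    tiles = yaw_bins * n_pitch + pitch_bins
    _, counts = np.unique(tiles, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Example: a user fixating on a narrow region vs. one scanning the whole scene.
focused  = trajectory_entropy(np.random.normal(0, 5, 1000),   np.random.normal(0, 3, 1000))
scanning = trajectory_entropy(np.random.uniform(0, 360, 1000), np.random.uniform(-60, 60, 1000))
print(f"focused user entropy ≈ {focused:.2f} bits, scanning user entropy ≈ {scanning:.2f} bits")
```

A lower entropy value corresponds to a more regular navigation style, which the abstract above associates with higher predictability.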

SESSION: Keynote Talk 2

Multisensory Immersive Experiences: From Monitoring of Human Influential Factors to New Applications in Healthcare

  • Tiago Henrique Falk

While virtual and extended reality applications are on the rise, existing experiences are not fully immersive, as only two senses (audio-visual) are typically stimulated. In this keynote talk, I will describe our ongoing work on developing multisensory immersive experiences, which combine auditory, visual, olfactory, and haptic/somatosensory stimuli. I will show the impact that stimulating more senses can have on user quality of experience, sense of presence and immersion, and engagement levels.

Moreover, with multisensory experiences, monitoring human influential factors is crucial, as the perception of sensory stimuli can be very subjective (e.g., while a smell can be pleasant for some, it can be unpleasant for others). To this end, I will also describe our work on instrumenting virtual reality headsets with biosensors to allow not only for automated (remote) monitoring of human behaviour and tracking of human influential factors, but also to develop new markers of user experience, such as a multimodal time perception metric or a cybersickness metric.

Lastly, I will describe some new applications of multisensory experiences that we are developing for healthcare and well-being. I will start with the use of immersive multisensory nature walks for mental health and describe two ongoing projects, one with patients with post-traumatic stress disorder and another with nurses suffering from burnout. I will conclude with a description of the use of multisensory priming for motor-imagery-based neurorehabilitation for stroke survivors.