AltMM'18 - Proceedings of the 3rd International Workshop on Multimedia Alternate Realities


SESSION: Keynote Address

Session details: Keynote Address

  •      Francesca De Simone

Augmented Human: Augmented Reality and Beyond

  •      Woontack Woo

Will Augmented Reality (AR) allow us to access digital information, experience others' stories, and thus explore alternate realities? AR has recently attracted attention again due to rapid advances in related multimedia technologies as well as various glass-type AR display devices. However, in order for AR to be widely adopted in alternate realities, it is necessary to improve various core technologies and integrate them into an AR platform. In particular, several technical challenges remain, such as 1) real-time recognition and tracking of multiple objects while generating an environment map, 2) organic user interfaces that are aware of users' implicit needs, intentions, and emotions as well as their explicit requests, 3) immersive multimodal content augmentation and just-in-time information visualization, and 4) multimodal interaction and collaboration during augmented telecommunication. In addition, in order to encourage user engagement and enable an AR ecosystem, AR standards should be established that support creating AR content, capturing user experiences, and sharing the captured experiences.

SESSION: Design of Virtual Collaborative Experiences

Session details: Design of Virtual Collaborative Experiences

  •      Pablo Cesar

Quantifying Group Navigation Experience in Collaborative Augmented Virtuality Tours

  •      Shanthi Vellingiri
  • Balakrishnan Prabhakaran

"Unnatural, or 'magic,' navigation techniques often exhibit more desirable characteristics than natural navigation techniques" [1]. In this paper, we evaluate the effectiveness of Natural (based on real-world gestures) and Unnatural (based on an event generated from an external device) navigation techniques in offering a group navigation experience in a collaborative Augmented Virtuality (AV) tour. AV is a subcategory of Mixed Reality (MR); it merges real-world objects and their actions into a virtual world. This feature of AV facilitates visual recognition of participants and their actions during a tour. In addition, by monitoring participants' actions and virtually accommodating them, a virtual group can stay spatially close during a tour; we term this "navigating-as-a-group".

As participants can join from several sites, network impairments among the participating sites can introduce intermittent navigation. As a result, a virtual group might appear virtually spread out rather than close-knit. Therefore, selecting a simple navigation technique that offers a better navigating-as-a-group experience in the midst of network impairments is essential. In this context, (i) with the help of the Hierarchical Position Discrepancy Model (HPDM) proposed in [2], we quantify the impact of network delay on the Natural and Unnatural navigation techniques, and (ii) evaluate the user experience scores against the HPDM results to suggest the most preferred navigation technique for a group AV tour. Results on a 3-site, 3-user group AV tour show that the "unnatural navigation" technique is better at providing a navigating-as-a-group experience.
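The abstract does not reproduce the details of HPDM, but the underlying intuition, that network delay makes each site see a stale copy of the shared tour position and thereby spreads the group apart, can be sketched with a toy discrepancy measure. The delay model, function names, and one-dimensional tour path below are illustrative assumptions, not the actual model from [2]:

```python
# Toy sketch: how network delay spreads a virtual group apart on a shared
# tour. Each site sees the tour position as it was `delay` seconds ago;
# the "spread" is the gap between the freshest and the most-delayed view.
# This is an illustrative stand-in for HPDM, not the model from the paper.

def delayed_position(trajectory, t, delay):
    """Position a remote site observes at time t under a fixed delay."""
    stale_t = max(0.0, t - delay)
    return trajectory(stale_t)  # trajectory: time -> position on the path

def group_spread(trajectory, delays, t):
    """Spatial spread of the group at time t across all sites' views."""
    seen = [delayed_position(trajectory, t, d) for d in delays]
    return max(seen) - min(seen)

# Example: a tour advancing at 1.5 units/s, three sites with different delays.
tour = lambda t: 1.5 * t
delays = [0.05, 0.20, 0.40]              # 50 ms, 200 ms, 400 ms
print(group_spread(tour, delays, t=10.0))  # 1.5 * (0.40 - 0.05) = 0.525
```

With equal delays at all sites the spread collapses to zero, which matches the intuition that it is delay *asymmetry* across sites, not delay per se, that breaks the navigating-as-a-group experience.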

Avatar: Enabling Immersive Collaboration via Live Mobile Video

  •      Sudheesh Singanamalla
  • William Thies
  • Colin Scott

Live mobile video streaming is gaining popularity around the world, thanks in part to increasingly pervasive smartphone ownership and cellular broadband coverage. We see an opportunity for mobile video streaming to enable new task-oriented experiences such as remote shopping, virtual interactive tourism, auditing and verification, mobile crowdsourcing, and remote physical-world games. We posit that such applications may eventually lead to new employment opportunities for remote agents in developing countries. In this paper we report our experiences conducting a technology probe for one such use case: an 'escape-the-room' physical puzzle game where some of the team members remotely interacted with a video stream that was produced by other team members who were physically present in the escape room. We designed and built a mobile streaming system called Avatar which we deployed in our study with 26 participants. We report findings from our study, including observations about appropriate communication modalities for remote collaborative game playing, as well as unexpected interactions and points of friction between participants.

SESSION: Innovative Display and Interaction Techniques

Session details: Innovative Display and Interaction Techniques

  •      Prabhakaran Balakrishnan

Refocusing Supports of Panorama Light-Field Images in Head-Mounted Virtual Reality

  •      Yu-Ming Lai
  • Cheng-Hsin Hsu

Virtual Reality (VR) has been extensively researched in recent years. Although the technology delivers some astonishing experiences, it may also cause a sense of imbalance, dizziness, and nausea when head-mounted displays (HMDs) are worn for a long period of time. In this paper, we integrate light-field technology into a novel panorama viewing system implemented in Unity to provide dynamic depth of field and natural focus effects for a better user experience. The system is also coupled with FOVE, an HMD with eye-tracking technology, to track eye movements and collect gazing parameters, e.g., the gazing point. With such a design, a more immersive panorama image/video viewing experience is delivered. A user-experience evaluation shows that our proposed system reduces the refocusing time by up to 319 times and increases the subjective Mean Opinion Score (MOS) by 19%.
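The core idea of gaze-driven refocusing over a light field can be sketched simply: read the scene depth under the gaze point, then display the pre-rendered focal slice whose focus depth is closest to it. The function names, the depth-map representation, and the discrete-slice scheme below are assumptions for illustration; the paper's actual FOVE integration and refocusing pipeline are more involved:

```python
# Illustrative sketch of gaze-driven refocusing: choose the pre-rendered
# light-field focal slice closest in depth to the point the user gazes at.
# Names and data layout are assumptions, not the paper's implementation.

def pick_focal_slice(gaze_xy, depth_map, slice_depths):
    """Return the index of the focal slice to display.

    gaze_xy:      (x, y) pixel coordinates of the gaze point
    depth_map:    2-D list of per-pixel scene depths (metres)
    slice_depths: depth at which each pre-rendered slice is in focus
    """
    x, y = gaze_xy
    target = depth_map[y][x]  # depth of the gazed-at surface
    return min(range(len(slice_depths)),
               key=lambda i: abs(slice_depths[i] - target))

# Tiny example: 2x2 depth map, three focal slices at 0.5 m, 2 m, and 10 m.
depth_map = [[0.6, 2.1],
             [9.5, 2.0]]
print(pick_focal_slice((0, 1), depth_map, [0.5, 2.0, 10.0]))  # gazes at 9.5 m -> 2
```

Selecting among pre-rendered slices rather than re-rendering the blur per frame is one plausible way such a system keeps refocusing latency low on an HMD.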

2.5D Interaction Space: Elevating 2D Display with an Over-the-Top Projection Mapping

  •      Hyundo Kim
  • Hokyoung Ryu
  • Hyemin Lee
  • Jieun Kim

This paper introduces a new 2.5D interaction space created by a touch-sensitive screen together with over-the-top projection mapping. Users can simultaneously interact with virtual objects on the touch-sensitive screen and move those virtual objects with real objects (or their fingers or hands) in the elevated 2.5D space projected onto the touch-sensitive screen by the over-the-top projector. Two 2D tracking devices are used to trace the real objects. Using the tracking devices, the states (position, rotation, size) of the real objects are calculated. Realistic images are then projection-mapped onto the real objects using the projector installed on top of the touch-sensitive display. We discuss three interaction mechanisms to be employed in the 2.5D interaction space: Poking, Swaying, and Rotating. A working prototype with these interaction mechanisms in the 2.5D interaction space was built, and a large-scale usability test is being planned.
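The state computation the abstract mentions, deriving an object's position, rotation, and size from 2D tracking so that images can be projection-mapped onto it, can be sketched for the simple case of a rigid object with two tracked endpoints. The two-marker scheme and all names here are assumptions; the paper's tracking setup may differ:

```python
import math

# Illustrative sketch: recover a rigid object's 2-D state (position,
# rotation, size) from two tracked endpoint markers, the inputs a
# projection-mapping stage would need. The two-marker scheme is an
# assumption, not necessarily the paper's method.

def object_state(p1, p2):
    """Return (centre, angle_degrees, length) for an object whose two
    endpoints are tracked at screen coordinates p1 and p2."""
    (x1, y1), (x2, y2) = p1, p2
    centre = ((x1 + x2) / 2, (y1 + y2) / 2)             # position
    angle = math.degrees(math.atan2(y2 - y1, x2 - x1))  # rotation
    length = math.hypot(x2 - x1, y2 - y1)               # size
    return centre, angle, length

centre, angle, length = object_state((100, 100), (160, 180))
print(centre, round(angle, 2), round(length, 1))  # (130.0, 140.0) 53.13 100.0
```

Per frame, the projector image would then be translated to `centre`, rotated by `angle`, and scaled by `length` so the rendered texture stays registered to the moving object.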

SESSION: Invited Talk

Session details: Invited Talk

  •      Wei Tsang Ooi

Measuring User Quality of Experience in Social VR systems

  •      Francesca De Simone

Virtual Reality (VR) is a computer-generated experience that can simulate physical presence in real or imagined environments [7]. A social VR system is an application that allows multiple users to join a collaborative Virtual Environment (VE), such as a computer-generated 3D scene or a 360-degree natural scene captured by an omnidirectional camera, and communicate with each other, usually by means of visual and audio cues. Each user is represented in the VE as a computer-generated avatar [3] or, in recently proposed systems, with a virtual representation based on live captures [1]. Depending on the system, the user's virtual representation can also interact with the virtual environment, for example by manipulating virtual objects, controlling the appearance of the VE, or controlling the playout of additional media in the VE. Interest in social VR systems dates back to the late 90s [4, 8] but has recently increased [2, 5, 6] due to the availability of affordable head-mounted displays on the consumer market and to the appearance of new applications, such as Facebook Spaces, YouTube VR, and Hulu VR, which explicitly aim at including social features in existing VR platforms for multimedia delivery. In this talk, we will address the problem of measuring user Quality of Experience (QoE) in social VR systems. We will review the studies that have analysed how different features of a social VR system design, such as avatar appearance and behavioural realism, can affect the user's experience, and propose a comparison of the objective and subjective measures used in the literature to quantify user QoE in social VR. Finally, we will discuss the use case of watching movies together in VR and present the results of one of our recent studies focusing on this scenario, designed and performed in the framework of the European project VRTogether (http://vrtogether.eu).
In particular, we present an analysis of the correlation between the objective and subjective measurements collected during our study, to provide guidelines toward the design of a unified methodology for monitoring and quantifying users' QoE in social VR systems. We also discuss the open questions that must be addressed in order to achieve this goal.
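The kind of analysis described, correlating an objective measurement against subjective ratings, typically reduces to computing a correlation coefficient such as Pearson's r per condition. The sketch below shows this on invented placeholder numbers; neither the metric pairing nor the values come from the study:

```python
# Illustrative sketch: Pearson correlation between an objective measurement
# and subjective MOS ratings, the basic building block of the objective vs.
# subjective comparison described in the talk. All data are placeholders.

def pearson(xs, ys):
    """Pearson's r between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# e.g. a per-session objective score vs. the reported MOS (placeholder data)
objective = [0.2, 0.5, 0.9, 1.4, 2.0]
mos       = [4.6, 4.1, 3.8, 3.0, 2.4]
print(round(pearson(objective, mos), 3))  # strong negative correlation, ~ -0.995
```

A strong correlation between an objective metric and MOS is what would justify using that metric to monitor QoE without running a subjective study each time.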