Human-Computer Interaction

Conception and Development of a Multimodal VR Adventure Game


This project has already been completed.

Background

Computer games like Monkey Island and Day of the Tentacle typically provide a predefined set of actions arrayed on the screen, e.g., “pick up”, “use”, or “talk to”. To progress the storyline, players solve puzzles by selecting these actions with the mouse. In Virtual Reality (VR), where the user is physically situated inside the environment, these traditional interaction techniques are either not feasible or can negatively impact the user’s feeling of immersion and/or presence.

Multimodal Interfaces (MMIs) implement human-computer interaction paradigms that center on users’ natural behavior and communication capabilities. Such interfaces combine at least two modalities, e.g., speech and gesture, which may be used simultaneously. Their potential benefits appear promising for VR applications. However, their VR-specific advantages and drawbacks have rarely been investigated so far, partly due to the high development effort such interfaces require.
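As a rough illustration of what such a combination of modalities can look like, the following Python sketch fuses a deictic speech command with a pointing gesture that occurs close in time. It is a minimal, hypothetical example: the event types, names, and time window are assumptions for illustration and do not reflect the semantic-fusion approach (concurrent Augmented Transition Networks) used in the referenced publications.

    from dataclasses import dataclass
    from typing import Optional

    # Hypothetical event types; names are illustrative and not taken from any framework.
    @dataclass
    class SpeechEvent:
        timestamp: float   # seconds
        command: str       # e.g. "pick that up"

    @dataclass
    class PointingEvent:
        timestamp: float   # seconds
        target_id: str     # id of the virtual object the user points at

    FUSION_WINDOW = 1.5    # assumed tolerance (in seconds) for temporal co-occurrence

    def fuse(speech: SpeechEvent, pointing: Optional[PointingEvent]) -> Optional[dict]:
        """Resolve a deictic command ("... that ...") against a pointing gesture
        that occurred close enough in time to the utterance."""
        if "that" in speech.command and pointing is not None:
            if abs(speech.timestamp - pointing.timestamp) <= FUSION_WINDOW:
                return {"action": "pick_up", "target": pointing.target_id}
        return None  # incomplete input: wait for the missing modality or ask back

    # Example: utterance and gesture arrive almost simultaneously and are fused.
    print(fuse(SpeechEvent(10.2, "pick that up"), PointingEvent(10.6, "wrench_01")))
    # -> {'action': 'pick_up', 'target': 'wrench_01'}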

Tasks

The aim of this project is to develop a multimodal VR adventure game in which the user has to solve puzzles to progress a storyline, e.g., finding a way to escape a malfunctioning spaceship. Special emphasis shall be placed on engaging gameplay, in particular on a meaningful use of various interaction techniques for invoking actions and solving the puzzles. On the one hand, a “pick up” action in VR can often be performed directly by reaching for an object and grabbing it, without the need for speech or menu-based interaction. On the other hand, a multimodal interface might be better suited than a menu-based solution when asking a virtual companion to assist in solving a puzzle. The Space Tentacles demonstration showcases a possible game scenario, and the Robot Museum illustrates different types of virtual companions. Teaser videos are linked below.

The tasks can be roughly summarized as follows:

Space Tentacles - A Multimodal VR Adventure Game

Robot Museum
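As a rough, hypothetical sketch of the interaction-technique choice described above (direct grabbing for objects within reach versus a multimodal request to a virtual companion), the following Python snippet illustrates one possible dispatch. The names, threshold, and structure are assumptions for illustration and are not part of the Space Tentacles code base.

    from dataclasses import dataclass

    GRAB_RANGE = 0.8  # metres; assumed maximum reach for direct manipulation

    @dataclass
    class VirtualObject:
        name: str
        distance_to_hand: float  # current distance to the user's tracked hand

    def invoke_pick_up(obj: VirtualObject, companion_available: bool) -> str:
        """Choose an interaction technique for a 'pick up' action."""
        if obj.distance_to_hand <= GRAB_RANGE:
            # Near objects: direct manipulation, no speech or menu needed.
            return f"direct grab of {obj.name}"
        if companion_available:
            # Out-of-reach objects: delegate via a multimodal request,
            # e.g. pointing at the object while saying "hand me that".
            return f"multimodal request: point at {obj.name} and say 'hand me that'"
        return "no suitable technique available"

    print(invoke_pick_up(VirtualObject("wrench", 0.4), companion_available=True))
    print(invoke_pick_up(VirtualObject("access card", 2.5), companion_available=True))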

Study Programs

The described tasks can be carried out by a team of up to 4 members from the study programs Human-Computer Interaction, Games Engineering, and Computer Science (with a focus on GE or HCI).

Prerequisites for HCI Students and CS Students with a Focus on HCI

Prerequisites for GE Students and CS Students with a Focus on GE

References

Zimmerer, C.; Fischbach, M.; Latoschik, M. E. Space Tentacles - Integrating Multimodal Input into a VR Adventure Game. In Proceedings of the 25th IEEE Virtual Reality (VR) Conference; IEEE, 2018.

Zimmerer, C.; Fischbach, M.; Latoschik, M. E. Semantic Fusion for Natural Multimodal Interfaces using Concurrent Augmented Transition Networks. Multimodal Technologies and Interaction 2018, 2, 81.


Contact Persons at the University of Würzburg

Chris Zimmerer (Primary Contact Person)
Human-Computer Interaction, Universität Würzburg
chris.zimmerer@uni-wuerzburg.de

Martin Fischbach
Human-Computer Interaction, Universität Würzburg
martin.fischbach@uni-wuerzburg.de

Sebastian von Mammen
Human-Computer Interaction, Universität Würzburg
sebastian.von.mammen@uni-wuerzburg.de

Marc Erich Latoschik
Human-Computer Interaction, Universität Würzburg
marc.latoschik@uni-wuerzburg.de
