Human-Computer Interaction

GIB MIR


Overview

Multimodal interfaces (MMIs) are a promising alternative human-computer interaction paradigm. They are feasible for a wide range of environments, yet they are especially well suited when interactions are spatially and temporally grounded in an environment in which the user is (physically) situated, such as virtual reality, mixed reality, human-robot interaction, and computer games. We use a concurrent Augmented Transition Network (cATN) to implement multimodal interfaces for a broad spectrum of demonstrations and research. It is the successor of the temporal Augmented Transition Network and is implemented in Simulator X.
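
To illustrate the underlying idea, the following minimal Scala sketch parses a combined speech-and-pointing command such as "paint that [pointing] yellow" with a hand-written transition network. It is an illustration only, not the cATN API: all identifiers and the 1500 ms temporal window are assumptions made for this example.

// Minimal ATN-style fusion sketch (illustrative only, not the cATN API).
object AtnFusionSketch {

  sealed trait Token { def time: Long }                          // time in milliseconds
  case class Speech(word: String, time: Long) extends Token      // a recognized word
  case class Pointing(target: String, time: Long) extends Token  // a resolved pointing gesture

  // The register accumulates the partial interpretation while traversing the network.
  case class Reg(obj: Option[String] = None,
                 color: Option[String] = None,
                 deicticAt: Option[Long] = None)

  // One transition step: (state, token) -> (next state, updated register), or failure.
  def step(state: String, reg: Reg, tok: Token): Option[(String, Reg)] =
    (state, tok) match {
      case ("S0", Speech("paint", _)) => Some("S1" -> reg)
      case ("S1", Speech("that", t))  => Some("S2" -> reg.copy(deicticAt = Some(t)))
      // Temporal grounding: the pointing gesture must occur close to the deictic word.
      case ("S2", Pointing(tgt, t)) if reg.deicticAt.exists(d => math.abs(t - d) < 1500) =>
        Some("S3" -> reg.copy(obj = Some(tgt)))
      case ("S3", Speech(color, _))   => Some("Done" -> reg.copy(color = Some(color)))
      case _                          => None
    }

  def parse(tokens: List[Token]): Option[Reg] =
    tokens.foldLeft(Option("S0" -> Reg())) {
      case (Some((s, r)), t) => step(s, r, t)
      case (None, _)         => None
    }.collect { case ("Done", r) => r }

  def main(args: Array[String]): Unit = {
    val utterance = List(Speech("paint", 0), Speech("that", 400),
                         Pointing("vase", 600), Speech("yellow", 900))
    println(parse(utterance)) // Some(Reg(Some(vase),Some(yellow),Some(400)))
  }
}

The real cATN goes well beyond this single-sequence sketch, most notably by handling concurrent input streams, but the principle of guarded state transitions over temporally annotated tokens is the same.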

Demonstrations

Robot Museum - A Multimodal Virtual Reality Game
We are happy to present a demonstration of David Heidrich's scientific internship, completed in the summer semester of 2018.
Space Tentacle - A Multimodal Adventure Game
We are happy to present a demonstration by Chris Zimmerer and Dr. Martin Fischbach.
Big Bang - A Multimodal VR Universe Builder
We are happy to present a proof of concept demonstration of Chris Zimmerer's master thesis.
Quest V2 Prototype Finished
The Quest V2 prototype is a mixed reality tabletop role-playing game with a novel combination of interaction styles and gameplay mechanics.

For Developers

If you are interested in using the cATN for research or teaching, please contact us. The latest version of our cATN is hosted on our institute's GitLab server, together with additional material that we use for teaching, such as how-tos and practical exercises. Getting access is as easy as sending a short note on the nature of your interest, along with an email address for registration, to one of the primary contact persons linked at the bottom of this page.

News

Best Paper Nominee at ICMI‘20
This year's joint contribution of the HCI chair and the Psychological Ergonomics chair, 'Finally on Par?! Multimodal and Unimodal Interaction for Open Creative Design Tasks in Virtual Reality', was nominated for Best Paper. Congratulations to Erik, Sara, Chris, Martin, Jean-Luc, and Marc!
Multimodal Interfaces Results of SS 2020
The semester comes to an end and we are happy to present the results of the Multimodal Interfaces course.
Best Paper Runner-Up at ICMI‘19
This year's joint contribution of the HCI chair and the Psychological Ergonomics chair, 'Paint that object yellow: Multimodal Interaction to Enhance Creativity During Design Tasks in VR', was awarded Best Paper Runner-Up. Congratulations to Erik, Sara, Chris, Jean-Luc, and Marc!
Multimodal Interaction Course Results of SS 2019
The semester comes to an end and we are happy to present the results of the Multimodal Interaction course.
HCI Group Published in MDPI's Multimodal Technologies and Interaction Journal
Chris Zimmerer, Dr. Martin Fischbach and Prof. Dr. Marc Erich Latoschik published their recent work on the development of multimodal interfaces.
Multimodal Interaction Course Results of SS 2018
The semester comes to an end and we are happy to present the results of the Multimodal Interaction course.
Robot Museum - A Multimodal Virtual Reality Game
We are happy to present a demonstration of David Heidrich's scientific internship, completed in the summer semester of 2018.
Space Tentacle - A Multimodal Adventure Game
We are happy to present a demonstration by Chris Zimmerer and Dr. Martin Fischbach.
Module and Project Results of WS 2017/18
The semester comes to an end and we are happy to present some great results of lecture modules and projects.
HCI Group Presents 1 Journal Paper, 4 Conference Papers, 1 SEARIS Paper, and 4 Posters at IEEE VR 2018 in Reutlingen
This year our group is strongly represented at the IEEE VR 2018, the largest VR conference of the research community.
German-Japanese Spring School on Human Factors 2018
Here you can find Japanese descriptions of our research demos for the German-Japanese Spring School on Human Factors 2018, a collaboration of the Psychological Ergonomics and HCI groups.
Module and Project Results of SS 2017
The semester comes to an end and we are happy to present some great results of lecture modules and projects.
Public PhD Thesis Defense by Martin Fischbach
Martin Fischbach publicly defends his thesis on 4 August 2017 at 10:00 in Z6, SE 1.010.
Module and Project Results of WS 2016/17
The semester comes to an end and we are happy to present some great results of lecture modules and projects.
Big Bang - A Multimodal VR Universe Builder
We are happy to present a proof of concept demonstration of Chris Zimmerer's master thesis.
Module, Project and Thesis Results of SS 2016
The semester comes to an end and we are happy to present some great results of lecture modules, projects, and theses.
Module, Project and Thesis Results of WS 2015/16
The semester comes to an end and we are happy to present some great results of lecture modules, projects, and theses.
Quest V2 Prototype Finished
The Quest V2 prototype is a mixed reality tabletop role-playing game with a novel combination of interaction styles and gameplay mechanics.
XRoads Demo at the Mobile Media Day
Berit Barkschat and Sascha Link present the latest results of the ongoing project XRoads at the Mobile Media Day in Würzburg.
Presentation at the ICMI '15 in Seattle
Martin Fischbach will present an extended exposé of his PhD thesis on software techniques for multimodal input processing in Realtime Interactive Systems at the International Conference on Multimodal Interaction in Seattle.

Theses and projects

Assigned

Show Me Your Moves
Recording, playback, and analysis of human performance in virtual environments (VEs) is an important foundation for systems that respond appropriately to (un)intentional non-verbal human actions. This project targets a general solution for recording time series of entity property changes within Unity, and its application both to generating artificial training data for image-based human pose detection and to the machine-learning-supported recognition of non-verbal human object references in a VE.
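
The recording idea can be sketched as follows; the project itself targets Unity (C#), so this engine-agnostic Scala sketch uses hypothetical entity and property names throughout.

// Engine-agnostic recording sketch (illustrative; the project itself targets Unity/C#).
import scala.collection.mutable

object PropertyRecorderSketch {

  // One sample: when an entity's property changed and what its new value was.
  case class Sample(time: Double, entity: String, property: String, value: Vector[Double])

  class Recorder {
    private val samples = mutable.ArrayBuffer.empty[Sample] // appended in chronological order
    private val last    = mutable.Map.empty[(String, String), Vector[Double]]

    // Store a sample only when the value actually changed, keeping the series sparse.
    def record(time: Double, entity: String, property: String, value: Vector[Double]): Unit =
      if (!last.get((entity, property)).contains(value)) {
        last((entity, property)) = value
        samples += Sample(time, entity, property, value)
      }

    // Playback: reconstruct the state at time t from the last change at or before t.
    def stateAt(t: Double): Map[(String, String), Vector[Double]] =
      samples.iterator.takeWhile(_.time <= t)
        .map(s => (s.entity, s.property) -> s.value).toMap
  }

  def main(args: Array[String]): Unit = {
    val rec = new Recorder
    rec.record(0.0, "rightHand", "position", Vector(0.0, 0.0, 0.0))
    rec.record(0.1, "rightHand", "position", Vector(0.0, 0.0, 0.0)) // unchanged, skipped
    rec.record(0.2, "rightHand", "position", Vector(0.1, 0.0, 0.0))
    println(rec.stateAt(0.15)) // Map((rightHand,position) -> Vector(0.0, 0.0, 0.0))
  }
}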

Rene Stingl

Do you mean me?
The goal of this HCI thesis is to investigate a novel interaction technique for determining a user's nonverbal deixis and to compare it with ray-casting in the context of a multimodal speech and gesture interface in VR.
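
For context, ray-casting, the baseline technique of this comparison, selects the first object whose bounding volume is hit by a ray cast from the user's hand or head. Below is a generic Scala illustration with assumed names, not code from the thesis.

// Generic ray-casting selection sketch: pick the closest object hit by the ray.
object RayCastSketch {

  case class Vec3(x: Double, y: Double, z: Double) {
    def -(o: Vec3): Vec3     = Vec3(x - o.x, y - o.y, z - o.z)
    def dot(o: Vec3): Double = x * o.x + y * o.y + z * o.z
    def length: Double       = math.sqrt(this dot this)
    def normalized: Vec3     = { val l = length; Vec3(x / l, y / l, z / l) }
  }

  // Objects are approximated by bounding spheres for the intersection test.
  case class Sphere(name: String, center: Vec3, radius: Double)

  // Distance along the (unit-length) ray to the sphere surface, or None on a miss.
  def intersect(origin: Vec3, dir: Vec3, s: Sphere): Option[Double] = {
    val oc   = origin - s.center
    val b    = oc dot dir
    val c    = (oc dot oc) - s.radius * s.radius
    val disc = b * b - c
    if (disc < 0) None
    else Some(-b - math.sqrt(disc)).filter(_ >= 0)
  }

  // Pick the closest hit object, mimicking a simple selection ray in VR.
  def pick(origin: Vec3, dir: Vec3, scene: Seq[Sphere]): Option[Sphere] = {
    val d = dir.normalized
    scene.flatMap(s => intersect(origin, d, s).map(t => t -> s))
      .sortBy(_._1).headOption.map(_._2)
  }

  def main(args: Array[String]): Unit = {
    val scene = Seq(Sphere("avatarA", Vec3(0, 0, -5), 0.5),
                    Sphere("avatarB", Vec3(2, 0, -5), 0.5))
    println(pick(Vec3(0, 0, 0), Vec3(0, 0, -1), scene).map(_.name)) // Some(avatarA)
  }
}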

Closed

Sascha Link

Comparison of multimodal fusion methods for real-time interactive systems - towards a general testbed
This thesis proposes a testbed for comparative evaluations of fusion engines and, as a proof of concept, compares two different fusion approaches: temporal augmented transition networks vs. unification.
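
To make the contrast concrete: the sketch in the Overview above illustrates the transition-network style, while unification merges partial interpretations from different modalities as long as their shared attributes do not conflict. A deliberately simplified Scala illustration with hypothetical attribute names:

// Simplified unification-based fusion sketch (illustrative attribute names).
object UnificationFusionSketch {

  // A feature structure: attribute-value pairs describing a partial interpretation.
  type FeatureStructure = Map[String, String]

  // Unification fails on conflicting values and otherwise merges both structures.
  def unify(a: FeatureStructure, b: FeatureStructure): Option[FeatureStructure] =
    if (a.keySet.intersect(b.keySet).forall(k => a(k) == b(k))) Some(a ++ b) else None

  def main(args: Array[String]): Unit = {
    val speech  = Map("action" -> "paint", "color" -> "yellow") // "paint that yellow"
    val gesture = Map("object" -> "vase")                       // pointing at the vase
    val clash   = Map("action" -> "delete")

    println(unify(speech, gesture)) // Some(Map(action -> paint, color -> yellow, object -> vase))
    println(unify(speech, clash))   // None: conflicting actions do not unify
  }
}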

Claudia Mehn

Gesture Recognition and Feature Selection
Tracking systems usually extract the positions of body joints and pass them on to a final decision unit, which classifies the data as different gestures. These decision units can, for example, be manually specified templates.
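
A deliberately simplified Scala illustration of such a template-based decision unit, with made-up trajectories; practical recognizers would normalize and resample the input and often use more robust distance measures such as dynamic time warping.

// Template-matching gesture classifier sketch (illustrative data only).
object TemplateGestureSketch {

  // A gesture trajectory: positions of one tracked joint over time.
  type Trajectory = Vector[(Double, Double, Double)]

  // Mean Euclidean distance between two equally long trajectories.
  def distance(a: Trajectory, b: Trajectory): Double =
    a.zip(b).map { case ((x1, y1, z1), (x2, y2, z2)) =>
      math.sqrt(math.pow(x1 - x2, 2) + math.pow(y1 - y2, 2) + math.pow(z1 - z2, 2))
    }.sum / a.length

  // The decision unit: label the input with the closest manually specified template.
  def classify(input: Trajectory, templates: Map[String, Trajectory]): String =
    templates.minBy { case (_, t) => distance(input, t) }._1

  def main(args: Array[String]): Unit = {
    val templates = Map(
      "wave" -> Vector((0.0, 1.0, 0.0), (0.2, 1.1, 0.0), (0.0, 1.0, 0.0)),
      "push" -> Vector((0.0, 1.0, 0.0), (0.0, 1.0, -0.3), (0.0, 1.0, -0.6))
    )
    val observed = Vector((0.0, 1.0, 0.0), (0.0, 1.0, -0.25), (0.0, 1.0, -0.55))
    println(classify(observed, templates)) // push
  }
}
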
Conception and Development of a Multimodal VR Adventure Game
The aim of this project is to design and implement a multimodal VR adventure game.

Jonas Müller

Adaptive Distributional Word Models for Robust Semantic Information Systems
This HCI master thesis focuses on distributional word models for robust semantic information systems.

Rene Stingl

Natural Pointing
The goal of this HCI project is to develop a technique for selecting objects which focuses on the naturalness of the interaction.

Ronja Heinrich

Natural Multimodal Speech and Gesture Interfaces for Virtual Reality - Towards Practical Design Guidelines
The goal of this thesis is to contribute to the body of research and to take a step towards practical design guidelines for multimodal interfaces.

Team

Publications
