Human-Computer Interaction

Interpersonal Synchronization in Large-scale Virtual Environments


This project has already been completed.

Background

Large-scale Virtual Environments (VEs) enable multiple users wearing Head-Mounted Displays (HMDs) to simultaneously explore digital artifacts such as industrial prototypes or architectural drafts. In contrast to CAVE setups, interpersonal communication is restricted, since the users cannot see each other. Incorporating virtual avatars counteracts this restriction, but requires sensors to detect and virtually replicate gestures and social signals.

In the course of this project, unobtrusive Bluetooth acceleration sensors shall be attached to the users' limbs. The data acquired by these sensors shall be analyzed using machine learning techniques to enable the user to trigger a small set of predefined animation sequences of their avatar, similar to emotes in video games.
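To illustrate one possible approach, the sketch below shows how windowed acceleration data could be mapped to a small set of gesture classes with an off-the-shelf classifier. It is only a minimal example under assumptions: the window length, feature set, emote labels, and the use of a random forest are illustrative choices and not part of the project specification.

# Minimal sketch, assuming tri-axial acceleration windows are already segmented.
# Window length, emote labels, and classifier choice are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

WINDOW = 64                          # samples per window (assumed ~1 s at 64 Hz)
EMOTES = ["wave", "clap", "point"]   # hypothetical animation triggers

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple per-axis statistics for one (WINDOW, 3) acceleration window."""
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        np.abs(np.diff(window, axis=0)).mean(axis=0),  # mean absolute change per axis
    ])

# Synthetic stand-in data: 300 labeled windows of tri-axial acceleration.
rng = np.random.default_rng(0)
windows = rng.normal(size=(300, WINDOW, 3))
labels = rng.integers(0, len(EMOTES), size=300)

X = np.stack([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# At runtime, a recognized window would map to a predefined avatar animation:
predicted = EMOTES[clf.predict(X[:1])[0]]
print("trigger animation:", predicted)

In a real deployment, the synthetic data would be replaced by recorded sensor streams from the limb-worn sensors, and the predicted class would drive the corresponding avatar animation sequence.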

Tasks

Conditions

The large-scale tracking system of the Fraunhofer IIS in Nürnberg will be used. This requires participants either to stay in Nürnberg or to commute there for about 3 to 4 weeks.

Partners

Fraunhofer IIS
Tobias Feigl
Dr. Stephan Otto
Dr.-Ing. Christopher Mutschler


Contact Persons at the University of Würzburg

Daniel Roth (Primary Contact Person)
HCI, University of Würzburg
daniel.roth@uni-wuerzburg.de
