Human-Computer Interaction
Summer EXPO 2025 Recap
The Summer EXPO 2025 for HCI/HCS, CS, and GE was a great success! Numerous visitors experienced up to 120 different demos and projects.
Summer Expo 2025 Invitation
We invite you to this year's Summer EXPO on 25 July!
6.5 Million Euros for Research on Digital Storytelling: HCI Group Joins New DFG Research Training Group TESDA
How do immersive technologies change the way we perceive and understand stories? The new DFG Research Training Group TESDA investigates this question. Prof. Marc Erich Latoschik and his team from the HCI Group are significantly involved with three subprojects on storytelling in immersive virtual reality.
AIL AT WORK @ denkbares Networking Day
The AIL team contributed to the event with three demonstrators, two questionnaires, and training material on AI literacy.

Recent Publications

Christian Merz, Niklas Krome, Carolin Wienrich, Stefan Kopp, Marc Erich Latoschik, The Impact of AI-Based Real-Time Gesture Generation and Immersion on the Perception of Others and Interaction Quality in Social XR, In IEEE Transactions on Visualization and Computer Graphics. 2025. To be published
@article{merz2025impact,
  author  = {Christian Merz and Niklas Krome and Carolin Wienrich and Stefan Kopp and Marc Erich Latoschik},
  journal = {IEEE Transactions on Visualization and Computer Graphics},
  year    = {2025},
  title   = {The Impact of AI-Based Real-Time Gesture Generation and Immersion on the Perception of Others and Interaction Quality in Social XR}
}
Abstract: This study explores how people interact in dyadic social eXtended Reality (XR), focusing on two main factors: the animation type of a conversation partner’s avatar and how immersed the user feels in the virtual environment. Specifically, we investigate how 1) idle behavior, 2) AI-generated gestures, and 3) motion-captured movements from a confederate (a controlled partner in the study) influence the quality of conversation and how that partner is perceived. We examined these effects in both symmetric interactions (where both participants use VR headsets and controllers) and asymmetric interactions (where one participant uses a desktop setup). We developed a social XR platform that supports asymmetric device configurations to provide varying levels of immersion. The platform also supports a modular avatar animation system providing idle behavior, real-time AI-generated co-speech gestures, and full-body motion capture. Using a 2×3 mixed design with 39 participants, we measured users’ sense of spatial presence, their perception of the confederate, and the overall conversation quality. Our results show that users who were more immersed felt a stronger sense of presence and viewed their partner as more human-like and believable. Surprisingly, however, the type of avatar animation did not significantly affect conversation quality or how the partner was perceived. Participants often reported focusing more on what was said rather than how the avatar moved.
Ronja Heinrich, Chris Zimmerer, Martin Fischbach, Marc Erich Latoschik, A Systematic Review of Fusion Methods for the User-Centered Design of Multimodal Interfaces, In Proceedings of the 27th International Conference on Multimodal Interaction (ICMI '25). Association for Computing Machinery, 2025.
@inproceedings{heinrich2025systematic,
  author    = {Ronja Heinrich and Chris Zimmerer and Martin Fischbach and Marc Erich Latoschik},
  year      = {2025},
  booktitle = {Proceedings of the 27th International Conference on Multimodal Interaction (ICMI '25)},
  publisher = {Association for Computing Machinery},
  title     = {A Systematic Review of Fusion Methods for the User-Centered Design of Multimodal Interfaces}
}
Abstract: This systematic review investigates the current state of research on multimodal fusion methods, i.e., the joint analysis of multimodal inputs, for intentional, instruction-based human-computer interactions, focusing on the combination of speech and spatially expressive modalities such as gestures, touch, pen, and gaze. We examine 50 systems from a User-Centered Design perspective, categorizing them by modality combinations, fusion strategies, application domains and media, as well as reusability. Our findings highlight a predominance of descriptive late fusion methods, limited reusability, and a lack of standardized tool support, hampering rapid prototyping and broader applicability. We identify emerging trends in machine learning-based fusion and outline future research directions to advance reusable and user-centered multimodal systems.
Andreas Halbig, Marc Erich Latoschik, The Interwoven Nature of Spatial Presence and Virtual Embodiment: A Comprehensive Perspective, In Frontiers in Virtual Reality, Vol. 6. 2025.
@article{halbig-interwoven,
  author  = {Andreas Halbig and Marc Erich Latoschik},
  journal = {Frontiers in Virtual Reality},
  url     = {https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2025.1616662/full},
  year    = {2025},
  volume  = {6},
  doi     = {10.3389/frvir.2025.1616662},
  title   = {The Interwoven Nature of Spatial Presence and Virtual Embodiment: A Comprehensive Perspective}
}
Samuel Truman, Sebastian von Mammen, VIA-VR: A Platform to Streamline the Development of Virtual Reality Serious Games for Healthcare, In 2025 IEEE 38th International Symposium on Computer-Based Medical Systems (CBMS), pp. 463-468. 2025.
@inproceedings{11058787,
  author    = {Samuel Truman and Sebastian von Mammen},
  url       = {https://ieeexplore.ieee.org/document/11058787},
  year      = {2025},
  booktitle = {2025 IEEE 38th International Symposium on Computer-Based Medical Systems (CBMS)},
  pages     = {463--468},
  doi       = {10.1109/CBMS65348.2025.00098},
  title     = {VIA-VR: A Platform to Streamline the Development of Virtual Reality Serious Games for Healthcare}
}
Sebastian Oberdörfer, Melina Heinisch, Tobias Mühling, Verena Schreiner, Sarah König, Marc Erich Latoschik, Ready for VR? Assessing VR Competence and Exploring the Role of Human Abilities and Characteristics, In Frontiers in Virtual Reality. 2025.
@article{oberdorfer2025ready,
  author  = {Sebastian Oberdörfer and Melina Heinisch and Tobias Mühling and Verena Schreiner and Sarah König and Marc Erich Latoschik},
  journal = {Frontiers in Virtual Reality},
  url     = {http://downloads.hci.informatik.uni-wuerzburg.de/2025-oberdoerfer-frontiers-vr-competence-preprint.pdf},
  year    = {2025},
  title   = {Ready for VR? Assessing VR Competence and Exploring the Role of Human Abilities and Characteristics}
}
Abstract: The use of VR for educational purposes provides the opportunity for integrating VR applications into assessments or graded examinations. Interacting with a VR environment requires specific human abilities, thus suggesting the existence of a VR competence. With regard to the emerging field of VR-based examinations, this VR competence might influence a candidate's final grade and hence should be taken into account. In this paper, we proposed and developed a VR competence assessment application. The application features eight individual challenges that are based on generic 3D interaction techniques. In a pilot study, we measured the performance of 18 users. By identifying significant correlations between VR competence score, previous VR experience, and theoretically grounded contributing human abilities and characteristics, we provide initial evidence that our VR competence assessment is effective. In addition, we provide initial data indicating that a specific VR competence exists. Our analyses further revealed that mainly spatial ability but also immersive tendency correlated with VR competence scores. These insights not only allow educators and researchers to assess and potentially equalize the VR competence level of their subjects, but also help designers to provide effective tutorials for first-time VR users.
Peter Kullmann, Theresa Schell, Mario Botsch, Marc Erich Latoschik, Eye-to-Eye or Face-to-Face? Face and Head Substitution for Co-Located AR, In Justine Saint-Aubert (Ed.), Frontiers in Virtual Reality, Vol. 6. 2025. provisionally accepted
@article{kullmann2025eyetoeye,
  author  = {Peter Kullmann and Theresa Schell and Mario Botsch and Marc Erich Latoschik},
  journal = {Frontiers in Virtual Reality},
  url     = {https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2025.1594350/},
  year    = {2025},
  editor  = {Justine Saint-Aubert},
  volume  = {6},
  doi     = {10.3389/frvir.2025.1594350},
  title   = {Eye-to-Eye or Face-to-Face? Face and Head Substitution for Co-Located AR}
}
S. Müller, A. Müller, S. Truman, T. Buhl, S. von Mammen, T. Brixner, femtoPro: Teaching and Training of Ultrafast Optics in Virtual Reality, In 2025 IEEE Conference on Education and Training in Optics and Photonics (ETOP), pp. 1-4. 2025.
@inproceedings{11030702,
  author    = {S. Müller and A. Müller and S. Truman and T. Buhl and S. von Mammen and T. Brixner},
  year      = {2025},
  booktitle = {2025 IEEE Conference on Education and Training in Optics and Photonics (ETOP)},
  pages     = {1--4},
  doi       = {10.1109/ETOP64842.2025.11030702},
  title     = {femtoPro: Teaching and Training of Ultrafast Optics in Virtual Reality}
}
André Markus, Astrid Carolus, Carolin Wienrich, Objective Measurement of AI Literacy: Development and Validation of the AI Competency Objective Scale (AICOS). 2025.
@misc{markus2025objectivemeasurementailiteracy,
  author = {André Markus and Astrid Carolus and Carolin Wienrich},
  url    = {https://arxiv.org/abs/2503.12921},
  year   = {2025},
  title  = {Objective Measurement of AI Literacy: Development and Validation of the AI Competency Objective Scale (AICOS)}
}