Human-Computer Interaction
HCI in first place in Germany in the CSRankings
The Chair of Human-Computer Interaction at JMU achieves top positions in the international Computer Science Rankings in the research field of Virtual Reality: 1st place in Germany, 3rd place in Europe, and 6th place worldwide.
XR Hub @ "XR meets Health"
On November 14, 2023, the XR Hub will present at this online event.
HCI and PIIS Research Demos Week
The HCI and PIIS groups hosted three demo sessions on the 17th, 18th, and 20th of October. We welcomed international visitors from the University of Bergen (Norway) and the University of Valle (Colombia), as well as visitors from the 'Arbeitskreis A der Deutschen Rentenversicherung' (Germany).
Multimodal Interfaces SS 2023 Projects
This summer, during the module Multimodal Interfaces, teams of students developed a multimodal speech and gesture interface for a VR application.
3D User Interfaces SS 2023 Projects
This summer, during the module 3D User Interfaces, teams of students developed VR Parkour-style games.

Open Positions

Student Worker for ViLeArn more
Unity development and research support
Student Workers for the VHB courses
We are looking for student workers to help develop and administer two VHB online courses
Research Associate (m/f/d) sought for the AIL AT WORK project
We have an open academic staff position for the AIL AT WORK project.
Student Workers for CoTeach Project
Unity development and research support
Open Research and PhD Position (TV-L E13, 100%)
The HCI group has an open position for a research assistant (and PhD candidate) in the general area of interactive systems and related research projects, e.g., VR, AR, avatars, or multimodal interfaces.
Student Workers for the CoTeach Project
We are looking for student workers to help develop and investigate fully immersive learning environments


Recent Publications

Michael F. Clements, Larissa Brübach, Jessica Glazov, Stephanie Gu, Rahila Kashif, Caroline Catmur, Alexandra L. Georgescu, Measuring trust with the Wayfinding Task: Implementing a novel task in immersive virtual reality and desktop setups across remote and in-person test environments, In PLOS ONE. 2023.
[BibSonomy]
@article{clements2023measuring, author = {Michael F. Clements and Larissa Brübach and Jessica Glazov and Stephanie Gu and Rahila Kashif and Caroline Catmur and Alexandra L. Georgescu}, journal = {PLOS ONE}, year = {2023}, title = {Measuring trust with the Wayfinding Task: Implementing a novel task in immersive virtual reality and desktop setups across remote and in-person test environments} }
Kathrin Gemesi, Nina Döllinger, Natascha-Alexandra Weinberger, Erik Wolf, David Mal, Carolin Wienrich, Claudia Luck-Sikorski, Erik Bader, Christina Holzapfel, Auswirkung von (virtuellen) Körperbildübungen auf das Ernährungsverhalten von Personen mit Adipositas – Ergebnisse der ViTraS-Pilotstudie, In Adipositas - Ursachen, Folgeerkrankungen, Therapie, Vol. 17(03), pp. S10-05. 2023.
[Download] [BibSonomy] [Doi]
@article{gemesi2023auswirkung, author = {Kathrin Gemesi and Nina Döllinger and Natascha-Alexandra Weinberger and Erik Wolf and David Mal and Carolin Wienrich and Claudia Luck-Sikorski and Erik Bader and Christina Holzapfel}, journal = {Adipositas - Ursachen, Folgeerkrankungen, Therapie}, number = {03}, url = {http://www.thieme-connect.com/products/ejournals/abstract/10.1055/s-0043-1771568}, year = {2023}, pages = {S10-05}, title = {Auswirkung von (virtuellen) Körperbildübungen auf das Ernährungsverhalten von Personen mit Adipositas – Ergebnisse der ViTraS-Pilotstudie} }
Abstract: Introduction: Multimodal obesity therapy consists of elements addressing nutrition, physical activity, and behavior or body image. Virtual reality (VR) applications can extend the methodological spectrum of body image therapy. In the present project, a VR system (consisting of an avatar and a virtual mirror) was developed to improve body perception and body image in persons with obesity. Methods: In the multicenter controlled ViTraS pilot study (registration number: DRKS00027906), virtual or traditional body image exercises were applied with persons with obesity. Three sessions took place at intervals of about two weeks. Anthropometric data and data on eating behavior (Dutch Eating Behavior Questionnaire (DEBQ) and a selection of questions from The Eating Motivation Survey (TEMS)) were collected over time. In two sessions, body image exercises were performed either virtually (VR group) or non-virtually (control group). An online follow-up survey took place 6 weeks after the last session. Results: A total of 66 persons (VR group: 31, control group: 35) were included in the study. Participants were 79% (52/66) female and 45±13 years old, and the mean BMI was 37±4 kg/m². The evaluation of the DEBQ showed that restrained eating in the VR group increased significantly (p<0.05) from the beginning to the end of the study. The selected TEMS questions showed no significant (p≥0.05) differences within or between groups. Conclusion: The study showed that performing virtual body image exercises can have an effect on the eating behavior of persons with obesity.
Vivek Nair, Christian Rack, Wenbo Guo, Rui Wang, Shuixian Li, Brandon Huang, Atticus Cull, James F. O'Brien, Marc Latoschik, Louis Rosenberg, Dawn Song, Inferring Private Personal Attributes of Virtual Reality Users from Head and Hand Motion Data. 2023. preprint
[Download] [BibSonomy] [Doi]
@misc{nair2023inferring, author = {Vivek Nair and Christian Rack and Wenbo Guo and Rui Wang and Shuixian Li and Brandon Huang and Atticus Cull and James F. O'Brien and Marc Latoschik and Louis Rosenberg and Dawn Song}, url = {https://arxiv.org/abs/2305.19198}, year = {2023}, title = {Inferring Private Personal Attributes of Virtual Reality Users from Head and Hand Motion Data} }
Abstract: Motion tracking "telemetry" data lies at the core of nearly all modern virtual reality (VR) and metaverse experiences. While generally presumed innocuous, recent studies have demonstrated that motion data actually has the potential to uniquely identify VR users. In this study, we go a step further, showing that a variety of private user information can be inferred just by analyzing motion data recorded from VR devices. We conducted a large-scale survey of VR users (N=1,006) with dozens of questions ranging from background and demographics to behavioral patterns and health information. We then obtained VR motion samples of each user playing the game "Beat Saber," and attempted to infer their survey responses using just their head and hand motion patterns. Using simple machine learning models, over 40 personal attributes could be accurately and consistently inferred from VR motion data alone. Despite this significant observed leakage, there remains limited awareness of the privacy implications of VR motion data, highlighting the pressing need for privacy-preserving mechanisms in multi-user VR applications.
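To make the abstract's approach concrete, here is a minimal, hypothetical sketch of the kind of "simple machine learning model" it describes: a standard classifier trained to predict one survey attribute from per-user motion features. The feature set and the synthetic data are assumptions for illustration, not the authors' pipeline.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: one feature vector per user, e.g. mean/std of
# head height, controller speed, and inter-hand distance (assumed features).
X = rng.normal(size=(1006, 6))     # 1,006 users, 6 motion features
y = rng.integers(0, 2, size=1006)  # one binary survey attribute

# A simple off-the-shelf classifier, in the spirit of the study.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")

With random data the accuracy hovers around chance; the paper's point is that real motion features carry far more signal.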
Christian Rack, Lukas Schach, Marc Latoschik, Motion Learning Toolbox – A Python library for preprocessing of XR motion tracking data for machine learning applications. 2023.
[Download] [BibSonomy]
@misc{rack2023motionlearningtoolbox, author = {Christian Rack and Lukas Schach and Marc Latoschik}, url = {https://github.com/cschell/Motion-Learning-Toolbox}, year = {2023}, title = {Motion Learning Toolbox – A Python library for preprocessing of XR motion tracking data for machine learning applications} }
Abstract: The Motion Learning Toolbox is a Python library designed to facilitate the preprocessing of motion tracking data in extended reality (XR) setups. It's particularly useful for researchers and engineers wanting to use XR tracking data as input for machine learning models. Originally developed for academic research targeting the identification of XR users by their motions, this toolbox includes a variety of data encoding methods that enhance machine learning model performance.
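As a flavor of the preprocessing the toolbox targets, the sketch below computes a velocity encoding of positional tracking data with plain pandas. It mirrors one kind of transformation the library provides but does not use its actual API; the column names and DataFrame layout are assumptions, so consult the repository README for the real interface.

import pandas as pd

# Assumed layout: one row per tracking frame, x/y/z positions of the headset.
frames = pd.DataFrame({
    "hmd_pos_x": [0.00, 0.01, 0.03, 0.06],
    "hmd_pos_y": [1.60, 1.60, 1.61, 1.61],
    "hmd_pos_z": [0.00, 0.00, 0.01, 0.02],
})

# Velocity encoding: frame-to-frame deltas instead of absolute positions,
# removing the dependence on where the user stood in the tracking space.
velocities = frames.diff().dropna()
print(velocities)

Encodings like this are what make motion data digestible for machine learning models, since absolute coordinates vary with room setup while relative motion does not.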
Franziska Westermeier, Larissa Brübach, Carolin Wienrich, Marc Erich Latoschik, A Virtualized Augmented Reality Simulation for Exploring Perceptual Incongruencies, In Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology. New York, NY, USA: Association for Computing Machinery, 2023.
[Download] [BibSonomy] [Doi]
@inproceedings{westermeier2023virtualized, author = {Franziska Westermeier and Larissa Brübach and Carolin Wienrich and Marc Erich Latoschik}, url = {https://doi.org/10.1145/3611659.3617227}, year = {2023}, booktitle = {Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology}, publisher = {Association for Computing Machinery}, address = {New York, NY, USA}, series = {VRST '23}, title = {A Virtualized Augmented Reality Simulation for Exploring Perceptual Incongruencies} }
Abstract: When blending virtual and physical content, certain incongruencies emerge from hardware limitations, inaccurate tracking, or differing appearances of virtual and physical content. They prevent us from perceiving virtual and physical content as one experience. Hence, it is crucial to investigate these issues to determine how they influence our experience. We present a virtualized augmented reality simulation that can systematically examine single incongruencies or different configurations of them.
Sooraj K. Babu, Tobias Brandner, Samuel Truman, Sebastian von Mammen, Investigating Crowdsourced Help Facilities for Enhancing User Guidance, In Nuria Pelechano, Fotis Liarokapis, Damien Rohmer, Ali Asadipour (Eds.), International Conference on Interactive Media, Smart Systems and Emerging Technologies (IMET). The Eurographics Association, 2023.
[Download] [BibSonomy] [Doi]
@inproceedings{10.2312:imet.20231252, author = {Sooraj K. Babu and Tobias Brandner and Samuel Truman and Sebastian von Mammen}, url = {https://diglib.eg.org/handle/10.2312/imet20231252}, year = {2023}, booktitle = {International Conference on Interactive Media, Smart Systems and Emerging Technologies (IMET)}, editor = {Nuria Pelechano and Fotis Liarokapis and Damien Rohmer and Ali Asadipour}, publisher = {The Eurographics Association}, title = {Investigating Crowdsourced Help Facilities for Enhancing User Guidance} }
Abstract: We present two help facilities aimed at addressing the challenges faced by users new to a piece of software. Ideally, software documentation familiarises users with the overall functionality, guides them through concrete workflows, and points out opportunities for customisation. Large degrees of freedom in use and vast basic scopes, however, often make it infeasible to convey this information upfront. Rather, it needs to be broken into helpful bits and pieces that are delivered at appropriate times. To address this challenge, we have designed two concrete help facilities, i.e. tooltips and tips-of-the-day, that feature crowdsourced information. In this paper, we present the results of a formative study of early proofs-of-concept implemented for the open-source game authoring platform Godot. To support further research, we have made the code base publicly available on GitHub.
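The following is a minimal sketch, in Python rather than Godot's GDScript, of the data model behind a crowdsourced tips-of-the-day facility as the abstract describes it: users submit tips, vote on them, and the application surfaces the highest-rated tip. All names here are hypothetical and not the paper's implementation; the actual code base is linked on GitHub.

from dataclasses import dataclass, field

@dataclass
class Tip:
    text: str
    author: str
    votes: int = 0

@dataclass
class TipPool:
    tips: list = field(default_factory=list)

    def submit(self, text, author):
        # Crowdsourcing: any user can contribute a help snippet.
        self.tips.append(Tip(text, author))

    def upvote(self, index):
        self.tips[index].votes += 1

    def tip_of_the_day(self):
        # Deliver the bit of information the crowd found most helpful.
        return max(self.tips, key=lambda t: t.votes)

pool = TipPool()
pool.submit("Hold Ctrl while dragging to snap nodes to the grid.", "user_a")
pool.submit("Press F1 to search the built-in class reference.", "user_b")
pool.upvote(1)
print(pool.tip_of_the_day().text)

The two example tips are illustrative strings, not claims about Godot's actual shortcuts.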
Nina Döllinger, Matthias Beck, Erik Wolf, David Mal, Mario Botsch, Marc Erich Latoschik, Carolin Wienrich, “If It’s Not Me It Doesn’t Make a Difference” – The Impact of Avatar Personalization on User Experience and Body Awareness in Virtual Reality, In 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). 2023.
[Download] [BibSonomy] [Doi]
@inproceedings{dollinger2023doesnt, author = {Nina Döllinger and Matthias Beck and Erik Wolf and David Mal and Mario Botsch and Marc Erich Latoschik and Carolin Wienrich}, url = {https://downloads.hci.informatik.uni-wuerzburg.de/2023-ismar-impact-of-avatar-appearance-preprint.pdf}, year = {2023}, booktitle = {2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)}, title = {“If It’s Not Me It Doesn’t Make a Difference” – The Impact of Avatar Personalization on User Experience and Body Awareness in Virtual Reality} }
Abstract: Body awareness is relevant for the efficacy of psychotherapy. However, previous work on virtual reality (VR) and avatar-assisted therapy has often overlooked it. We investigated the effect of avatar individualization on body awareness in the context of VR-specific user experience, including sense of embodiment (SoE), plausibility, and sense of presence (SoP). In a between-subject design, 86 participants embodied three avatar types and engaged in VR movement exercises. The avatars were (1) generic and gender-matched, (2) customized from a set of pre-existing options, or (3) personalized photorealistic scans. Compared to the other conditions, participants with personalized avatars reported increased SoE, yet higher eeriness and reduced body awareness. Further, SoE and SoP positively correlated with body awareness across conditions. Our results indicate that VR user experience and body awareness do not always dovetail and do not necessarily predict each other. Future research should work towards a balance between body awareness and SoE.
J. Bruschke, C. Kröber, F. Maiwald, R. Utescher, A. Pattee, Introducing a Multimodal Dataset for the Research of Architectural Elements, In The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XLVIII-M-2-2023, pp. 325-331. Copernicus GmbH, 2023.
[Download] [BibSonomy] [Doi]
@article{Bruschke_archilabel_2023, author = {J. Bruschke and C. Kröber and F. Maiwald and R. Utescher and A. Pattee}, journal = {The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences}, url = {https://doi.org/10.5194%2Fisprs-archives-xlviii-m-2-2023-325-2023}, year = {2023}, publisher = {Copernicus GmbH}, pages = {325--331}, title = {INTRODUCING A MULTIMODAL DATASET FOR THE RESEARCH OF ARCHITECTURAL ELEMENTS} }