Human-Computer Interaction
The University of Würzburg Showcases HCI Student Projects in New WinterExpo 2024/25 Video
Discover three innovative projects by HCI students, highlighted as standout examples in the official WinterExpo 2024/25 video.
Research Project AIL AT WORK Receives Extension
The project will be funded for another 1.5 years by the German Federal Ministry of Labor and Social Affairs.
Winter EXPO 2025 Recap
The Winter EXPO 2024/25 for MCS/HCI was a great success! Thank you to all those involved and to all visitors!
Interviews on Large Language Model DeepSeek R1
Prof. Latoschik gave several interviews about the new large language model DeepSeek R1!
Winter Expo 2025 Invitation
This year's Winter Expo takes place on 7 February 2025. Feel free to join us and experience many interesting projects.

Recent Publications

Christian Merz, Marc Erich Latoschik, Carolin Wienrich, Breaking Immersion Barriers: Smartphone Viability in Asymmetric Virtual Collaboration, In CHI 25 Conference on Human Factors in Computing Systems Extended Abstracts, pp. 1-8. 2025. To be published.
[Download] [BibSonomy] [Doi]
@inproceedings{merz2025smartphone,
  author    = {Christian Merz and Marc Erich Latoschik and Carolin Wienrich},
  url       = {https://downloads.hci.informatik.uni-wuerzburg.de/2025-chilbw-smartphone-asymmetry.pdf},
  year      = {2025},
  booktitle = {CHI 25 Conference on Human Factors in Computing Systems Extended Abstracts},
  pages     = {1-8},
  doi       = {10.1145/3706599.3719814},
  title     = {Breaking Immersion Barriers: Smartphone Viability in Asymmetric Virtual Collaboration}
}
Abstract: As demand grows for cross-device collaboration in virtual environments, users increasingly join shared spaces on varying hardware ranging from head-mounted displays (HMDs) to everyday lower-immersion smartphones. This paper investigates smartphone-based participation compared with fully immersive VR in dyadic asymmetric interaction. One participant joins via an HMD, while the other uses a smartphone. Through a collaborative sorting task, we evaluate self-perception (presence, embodiment), other-perception (co-presence, social presence, avatar plausibility), and task-perception (task load, enjoyment). We compare our results with previous work that examined VR-VR and desktop-VR pairings. The results show that smartphone users report lower self-perception than VR users. However, other-perception remains comparable to immersive setups. Interestingly, smartphone participants experience lower mental demand. It appears that device familiarity and intuitive interfaces can compensate for reduced immersion. Overall, our work highlights the viability of smartphones for asymmetric interaction, offering high accessibility without impairing social interaction.
Andrea Bellucci, Giulio Jacucci, Kien Duong, Pritom K Das, Sergei Smirnov, Imtiaj Ahmed, Jean-Luc Lugrin, Immersive Tailoring of Embodied Agents Using Large Language Models, In Proceedings of the 32nd IEEE Conference on Virtual Reality and 3D User Interfaces. 2025.
[BibSonomy]
@inproceedings{bellucci2025immersive,
  author    = {Andrea Bellucci and Giulio Jacucci and Kien Duong and Pritom K Das and Sergei Smirnov and Imtiaj Ahmed and Jean-Luc Lugrin},
  year      = {2025},
  booktitle = {Proceedings of the 32nd IEEE Conference on Virtual Reality and 3D User Interfaces},
  series    = {IEEE VR 2025},
  title     = {Immersive Tailoring of Embodied Agents Using Large Language Models}
}
Abstract: LLM-based embodied agents are emerging in VR, supporting scenarios such as pedagogical assistants, virtual companions, and NPCs for games. While they have the potential to enhance user interactions, they require careful design to cater to unique user needs and contexts. We present an architecture that leverages different LLM modules, and their engineering integration, to enable conversational interactions with an embodied agent in multi-user VR. Our system’s primary goal is to facilitate immersive tailoring through conversational input, allowing users to dynamically adjust an agent’s behavior and properties (e.g., role, personality, and appearance) directly within the virtual space, rather than during development or via separate interfaces. We demonstrate this approach with a use case and provide performance measurements in terms of latency of tailoring.
Lena Holderrieth, Erik Wolf, Marie Luisa Fiedler, Mario Botsch, Marc Erich Latoschik, Carolin Wienrich, Do You Feel Better? The Impact of Embodying Photorealistic Avatars with Ideal Body Weight on Attractiveness and Self-Esteem in Virtual Reality, In 2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (IEEE VRW), pp. 1404-1405. IEEE Computer Science, 2025. Best Poster 🏆
[Download] [BibSonomy]
@inproceedings{holderrieth2025better,
  author    = {Lena Holderrieth and Erik Wolf and Marie Luisa Fiedler and Mario Botsch and Marc Erich Latoschik and Carolin Wienrich},
  url       = {https://downloads.hci.informatik.uni-wuerzburg.de/2025-ieeevr-holderrieth-do-you-feel-better.pdf},
  year      = {2025},
  booktitle = {2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (IEEE VRW)},
  publisher = {IEEE Computer Science},
  pages     = {1404-1405},
  title     = {Do You Feel Better? The Impact of Embodying Photorealistic Avatars with Ideal Body Weight on Attractiveness and Self-Esteem in Virtual Reality}
}
Abstract: Body weight issues can manifest in low self-esteem through a negative body image or a feeling of unattractiveness. To explore potential interventions, this pilot study examined whether embodying a photorealistically personalized avatar with enhanced attractiveness affects self-esteem. Participants in the manipulation group adjusted their avatar's body weight to their self-defined ideal, while a control group used unmodified avatars. To confirm the manipulation, we measured the avatars' perceived attractiveness. Results showed that participants found avatars at their ideal weight significantly more attractive, confirming an effective manipulation. Further, the ideal-weight group showed a clear trend towards higher self-esteem post-exposure.
Marie Luisa Fiedler, Arne Bürger, Sabrina Mittermeier, Mario Botsch, Marc Erich Latoschik, Carolin Wienrich, Evaluating VR and AR Mirror Exposure for Anorexia Nervosa Therapy in Adolescents: A Method Proposal for Understanding Stakeholder Perspectives, In 2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (IEEE VRW), pp. 965-970. IEEE Computer Science, 2025.
[Download] [BibSonomy]
@inproceedings{fiedler2025evaluating,
  author    = {Marie Luisa Fiedler and Arne Bürger and Sabrina Mittermeier and Mario Botsch and Marc Erich Latoschik and Carolin Wienrich},
  url       = {https://downloads.hci.informatik.uni-wuerzburg.de/2025-ieeevr-fiedler-stakeholder-focus-group-proposal.pdf},
  year      = {2025},
  booktitle = {2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (IEEE VRW)},
  publisher = {IEEE Computer Science},
  pages     = {965-970},
  title     = {Evaluating VR and AR Mirror Exposure for Anorexia Nervosa Therapy in Adolescents: A Method Proposal for Understanding Stakeholder Perspectives}
}
Abstract: Body image distortions in anorexia nervosa pose significant therapeutic challenges, requiring innovative interventions. Virtual Reality (VR) and Augmented Reality (AR) technologies offer promising solutions, yet the preferences of stakeholders such as therapists and patients remain unexplored. This methodological proposal outlines focus groups to compare VR and AR mirror exposures using personalized and body-weight-modifiable avatars in anorexia nervosa therapy. Therapists will evaluate therapeutic potential, risks, and practicality, while adolescent patients will assess comfort, stress responses, and usability. The findings aim to advance the user-centered integration of VR and AR into anorexia nervosa therapy, addressing critical treatment gaps.
Jonathan Tschanter, Christian Merz, Carolin Wienrich, Marc Erich Latoschik, Towards Understanding Harassment in Social Virtual Reality: A Study Design on the Impact of Avatar Self-Similarity, In 2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (IEEE VRW). IEEE Computer Science, 2025.
[Download] [BibSonomy]
@inproceedings{tschanter2025harassment,
  author    = {Jonathan Tschanter and Christian Merz and Carolin Wienrich and Marc Erich Latoschik},
  url       = {https://downloads.hci.informatik.uni-wuerzburg.de/2025-ieeevrw-towards-understanding-harassment-in-social-virtual-reality.pdf},
  year      = {2025},
  booktitle = {2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (IEEE VRW)},
  publisher = {IEEE Computer Science},
  title     = {Towards Understanding Harassment in Social Virtual Reality: A Study Design on the Impact of Avatar Self-Similarity}
}
Abstract: In social virtual reality (VR), harassment persists as a pervasive and critical issue. Prior work emphasizes its perceived realness and emotional impact. However, the influence of avatar design, particularly the role of self-similarity, remains underexplored. Self-similar avatars can enhance user identification and engagement, yet potentially intensify the psychological and physiological effects of harassment. Existing studies often rely on interviews or user-generated content, lacking systematic analysis and controlled comparisons. To address these gaps, we present a process for creating realistic VR harassment scenarios. We built a scenario based on existing literature and validated it with expert reviews and user feedback. We propose a 2 x 2 between-subjects design to systematically examine users' emotional and physiological states, their identification with avatars, and the effects of avatar self-similarity. The study design will deepen the understanding of harassment dynamics in VR. Additionally, it can provide actionable insights for designing safer, more inclusive virtual environments that promote user well-being and foster equitable communities.
Christian Merz, Carolin Wienrich, Marc Erich Latoschik, Does Task Matter? Task-Dependent Effects of Cross-Device Collaboration on Social Presence, In 2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (IEEE VRW). IEEE Computer Science, 2025.
[Download] [BibSonomy] [Doi]
@inproceedings{merz2025taskasymmetry,
  author    = {Christian Merz and Carolin Wienrich and Marc Erich Latoschik},
  url       = {https://downloads.hci.informatik.uni-wuerzburg.de/2025-ieeevrw-task-cross-device.pdf},
  year      = {2025},
  booktitle = {2025 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (IEEE VRW)},
  publisher = {IEEE Computer Science},
  doi       = {10.1109/VRW66409.2025.00116},
  title     = {Does Task Matter? Task-Dependent Effects of Cross-Device Collaboration on Social Presence}
}
Abstract: In this work, we explored asymmetric collaboration under two distinct tasks: a collaborative sorting task and a conversational talking task. We answer the research question of how different tasks impact the user experience in asymmetric interaction. Our mixed design compared one symmetric and one asymmetric interaction across two tasks, assessing self-perception (presence, embodiment), other-perception (co-presence, social presence, plausibility), and task perception (task load, enjoyment). Fifty-two participants collaborated in dyads on the two tasks, either both using head-mounted displays (HMDs) or with one participant using an HMD and the other a desktop setup. Results indicate that differences in social presence diminished or disappeared during the purely conversational talking task in comparison to the sorting task. This indicates that differences in how a social interaction is perceived, caused by asymmetric interaction, only occur in specific use cases. These findings underscore the critical role of task characteristics in shaping users’ social XR experiences and highlight that asymmetric collaboration can be effective across different use cases and is even on par with symmetric interaction during conversations.
Marie Luisa Fiedler, Mario Botsch, Carolin Wienrich, Marc Erich Latoschik, Self-Similarity Beats Motor Control in Augmented Reality Body Weight Perception, In IEEE Transactions on Visualization and Computer Graphics (TVCG), IEEE VR 25 special issue. 2025. Honorable Mention 🏆
[Download] [BibSonomy] [Doi]
@article{fiedler2025selfsimilarity,
  author  = {Marie Luisa Fiedler and Mario Botsch and Carolin Wienrich and Marc Erich Latoschik},
  journal = {IEEE Transactions on Visualization and Computer Graphics (TVCG), IEEE VR 25 special issue},
  url     = {https://downloads.hci.informatik.uni-wuerzburg.de/2025-ieeevr-fiedler-self-similarity-beats-motor-control.pdf},
  year    = {2025},
  doi     = {10.1109/TVCG.2025.3549851},
  title   = {Self-Similarity Beats Motor Control in Augmented Reality Body Weight Perception}
}
Abstract: This paper investigates if and how self-similarity and having motor control impact sense of embodiment, self-identification, and body weight perception in Augmented Reality (AR). We conducted a 2x2 mixed design experiment involving 60 participants who interacted with either synchronously moving virtual humans or independently moving ones, each with self-similar or generic appearances, across two consecutive AR sessions. Participants evaluated their sense of embodiment, self-identification, and body weight perception of the virtual human. Our results show that self-similarity significantly enhanced sense of embodiment, self-identification, and the accuracy of body weight estimates with the virtual human. However, the effects of having motor control over the virtual human movements were notably weaker in these measures than in similar VR studies. Further analysis indicated that not only the virtual human itself but also the participants' body weight, self-esteem, and body shape concerns predict body weight estimates across all conditions. Our work advances the understanding of virtual human body weight perception in AR systems, emphasizing the importance of factors such as coherence with the real-world environment.
Franziska Westermeier, Chandni Murmu, Kristopher Kohm, Christopher Pagano, Carolin Wienrich, Sabarish V. Babu, Marc Erich Latoschik, Interpupillary to Inter-Camera Distance of Video See-Through AR and its Impact on Depth Perception, In Proceedings of the 32nd IEEE Virtual Reality Conference (VR '25). 2025. To be published.
[BibSonomy] [Doi]
@inproceedings{westermeier2025interpupillary,
  author    = {Franziska Westermeier and Chandni Murmu and Kristopher Kohm and Christopher Pagano and Carolin Wienrich and Sabarish V. Babu and Marc Erich Latoschik},
  year      = {2025},
  booktitle = {Proceedings of the 32nd IEEE Virtual Reality conference (VR '25)},
  doi       = {10.1109/VR59515.2025.00077},
  title     = {Interpupillary to Inter-Camera Distance of Video See-Through AR and its Impact on Depth Perception}
}
Abstract: Interpupillary distance (IPD) is a crucial characteristic of head-mounted displays (HMDs) because it defines an important property for generating a stereoscopic parallax, which is essential for correct depth perception. This is why contemporary HMDs offer adjustable lenses to adapt to users' individual IPDs. However, today's Video See-Through Augmented Reality (VST AR) HMDs use fixed camera placements to reconstruct the stereoscopic view of a user's environment. This leads to a potential mismatch between individual IPD settings and the fixed Inter-Camera Distances (ICD), which in turn can lead to perceptual incongruencies, limiting the usability and potentially the applicability of VST AR in depth-sensitive use cases. To investigate this incongruency between IPD and ICD, we conducted a 2x3 mixed-factor design empirical evaluation using a near-field, open-loop reaching task comparing distance judgments of Virtual Reality (VR) and VST AR. We also explored improvements in reaching performance via perceptual calibration by incorporating a feedback phase between pre- and post-phase conditions, with a particular focus on the influence of IPD-ICD differences. Our Linear Mixed Model (LMM) analysis showed a significant difference between VR and VST AR, a significant effect of IPD-ICD mismatch, as well as a combined effect of both factors. This novel insight and its consequences are discussed specifically for depth perception tasks in AR, eXtended Reality (XR), and potential use cases.