2026
Florian Kern, Lukas Polifke, Paula Friedrich, Marc Erich Latoschik, Carolin Wienrich, David Obremski, CECA - A Configurable Framework for Embodied Conversational AI Agents in Extended Reality. In 2026 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). 2026. To be published.
@inproceedings{kern2026configurable,
title = {CECA - A Configurable Framework for Embodied Conversational AI Agents in Extended Reality},
author = {Kern, Florian and Polifke, Lukas and Friedrich, Paula and Latoschik, Marc Erich and Wienrich, Carolin and Obremski, David},
booktitle = {2026 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},
year = {2026},
note = {To be published},
url = {}
}
Abstract: We present CECA, a configurable framework for embodied conversational AI agents in Unity-based extended reality (XR) applications. CECA employs a client–server architecture to decouple agent logic from game engine–based embodiment. Built on LiveKit Agents, our approach integrates speech-to-text (STT), large language models (LLMs), and text-to-speech (TTS) into a unified, streaming voice-to-voice pipeline configured via metadata rather than code changes. We outline how this architecture flexibly integrates local and cloud AI providers while mitigating limited provider SDK support in Unity. Finally, we highlight opportunities for future work, including multi-agent scenarios, higher-level templates for XR research, and systematic user studies.
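The metadata-over-code configuration idea from the abstract can be sketched in a few lines of Python. Everything below is illustrative only: the field names, fallback providers, and `PipelineConfig` type are invented for this sketch and are not CECA's or LiveKit Agents' actual API.

```python
from dataclasses import dataclass

# Hypothetical metadata schema -- CECA's real configuration format is not
# given in the abstract, so all names here are invented for illustration.
@dataclass
class PipelineConfig:
    stt_provider: str  # speech-to-text backend (local or cloud)
    llm_provider: str  # large language model backend
    tts_provider: str  # text-to-speech backend

def load_pipeline(metadata: dict) -> PipelineConfig:
    """Select STT/LLM/TTS providers from metadata instead of code changes,
    falling back to local providers when no cloud service is named."""
    return PipelineConfig(
        stt_provider=metadata.get("stt", "local-whisper"),
        llm_provider=metadata.get("llm", "local-llm"),
        tts_provider=metadata.get("tts", "local-tts"),
    )

# Swapping the LLM provider is a metadata change, not a code change.
config = load_pipeline({"llm": "cloud-gpt"})
print(config.stt_provider, config.llm_provider)  # local-whisper cloud-gpt
```

In the client-server split the abstract describes, a configuration object like this would live on the agent server, so the Unity client never has to link against provider SDKs.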
2025
Florian Kern, Using Controller Styluses for Virtual Keyboards and Handwriting Text Input in XR. PhD thesis, Universität Würzburg, 2025.
@phdthesis{https://doi.org/10.25972/opus-42563,
title = {Using Controller Styluses for Virtual Keyboards and Handwriting Text Input in XR},
author = {Kern, Florian},
year = {2025},
school = {Universität Würzburg},
url = {https://opus.bibliothek.uni-wuerzburg.de/42563},
doi = {10.25972/OPUS-42563}
}
Abstract: This dissertation investigates the feasibility and applicability of repurposing consumer-grade XR controllers as controller styluses and evaluates their impact on the performance and user experience of virtual tap and swipe keyboards and handwriting text input in XR environments. Text input is a core feature of many XR applications, enabling tasks such as documenting, note-taking, chatting, and web browsing. However, XR, encompassing VR, AR, and MR, presents distinct challenges that limit traditional text input methods like physical keyboards or handwriting with pen and paper. As an alternative, prior research explored virtual keyboards and handwriting text input in VR and optical see-through (OST) AR, utilizing XR controllers held in the conventional power grip or hand tracking. Yet, fundamental research gaps remained. These include the feasibility and applicability of repurposing consumer-grade XR controllers as controller styluses by holding them in a pen-like posture, such as the precision grip, integrating diverse XR devices and input modalities, comparing the performance and user experience of text input methods in VR and video see-through (VST) AR, and understanding the impact of mid-air and physically aligned virtual surfaces. To address these gaps, this dissertation introduces the Off-The-Shelf Stylus (OTSS), a modular and extensible framework for repurposing consumer-grade XR controllers as controller styluses equipped with self-made or 3D-printed stylus accessories. OTSS also incorporates virtual-to-physical alignment and refinement techniques to align virtual surfaces to physical counterparts or freely place them in mid-air. Additionally, this dissertation presents the Reality Stack I/O (RSIO) framework, an intermediate layer designed to simplify and unify cross-device and cross-platform XR application development. A series of user studies and technical evaluations demonstrate the applicability and versatility of the OTSS and RSIO frameworks.
Building on these frameworks, two user studies involving a total of 136 participants provide detailed insights into the performance and user experience of virtual tap and swipe keyboards and handwriting text input in VR and VST AR. The findings underscore the potential of controller styluses for precise touch-based interaction on mid-air and physically aligned virtual surfaces, particularly when equipped with pressure-sensitive stylus tips for physical contact detection. Moreover, the results indicate that visual incongruencies are a distinct challenge in VST AR and suggest that while physical surfaces are desirable for text input in XR, they are not indispensable in mobile XR scenarios. Publicly available reference implementations are provided to establish a foundation for future research and the development of XR text input methods for professional, educational, and personal environments.
2024
Samantha Monty, Florian Kern, Marc Erich Latoschik, Analysis of Immersive Mid-Air Sketching Behavior, Sketch Quality, and User Experience in Design Ideation Tasks. In 23rd IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE Computer Society, 2024.
@inproceedings{monty2024,
title = {Analysis of Immersive Mid-Air Sketching Behavior, Sketch Quality, and User Experience in Design Ideation Tasks},
author = {Monty, Samantha and Kern, Florian and Latoschik, Marc Erich},
booktitle = {23rd IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
year = {2024},
publisher = {IEEE Computer Society},
url = {https://ieeexplore.ieee.org/document/10765456},
doi = {10.1109/ISMAR62088.2024.00041}
}
Abstract: Immersive 3D sketching systems empower users with tools to create sketches directly in the air around themselves, in all three dimensions, using only simple hand gestures. These sketching systems have the potential to greatly extend the interactive capabilities of immersive learning environments. The perceptual challenges of Virtual Reality (VR), however, combined with the ergonomic and cognitive challenges of creating mid-air 3D sketches, reduce the effectiveness of immersive sketching for problem-solving, reflection, and capturing fleeting ideas. We contribute to the understanding of the potential challenges of mid-air sketching systems in educational settings, where expression is valued higher than accuracy and sketches are used to support problem-solving and to explain abstract concepts. We conducted an empirical study with 36 participants with different spatial abilities to investigate whether the way people sketch in mid-air depends on the goal of the sketch. We compare the technique, quality, efficiency, and experience of participants as they create 3D mid-air sketches in three different tasks. We examine how users approach mid-air sketching when the sketches they create serve to convey meaning and when sketches are merely reproductions of geometric models created by someone else. We found that in tasks aimed at expressing personal design ideas, participants moved their heads more between starting and ending strokes, moved their controllers at higher velocities, and created strokes in less time than in tasks aimed at recreating 3D geometric figures. They reported feeling less time pressure to complete sketches but redacted a larger percentage of strokes. These findings serve to inform the design of creative virtual environments that support reasoning and reflection through mid-air sketching. With this work, we aim to strengthen the power of immersive systems that support mid-air 3D sketching by exploiting natural user behavior to help users more quickly and faithfully convey their meaning in sketches.
Florian Kern, Jonathan Tschanter, Marc Erich Latoschik, Handwriting for Text Input and the Impact of XR Displays, Surface Alignments, and Sentence Complexities. In IEEE Transactions on Visualization and Computer Graphics, pp. 1-11. 2024.
@article{10460576,
title = {Handwriting for Text Input and the Impact of XR Displays, Surface Alignments, and Sentence Complexities},
author = {Kern, Florian and Tschanter, Jonathan and Latoschik, Marc Erich},
journal = {IEEE Transactions on Visualization and Computer Graphics},
year = {2024},
pages = {1--11},
url = {https://ieeexplore.ieee.org/document/10460576},
doi = {10.1109/TVCG.2024.3372124}
}
Abstract: Text input is desirable across various eXtended Reality (XR) use cases and is particularly crucial for knowledge and office work. This article compares handwriting text input between Virtual Reality (VR) and Video See-Through Augmented Reality (VST AR), facilitated by physically aligned and mid-air surfaces when writing simple and complex sentences. In a 2x2x2 experimental design, 72 participants performed two ten-minute handwriting sessions, each including ten simple and ten complex sentences representing text input in real-world scenarios. Our developed handwriting application supports different XR displays, surface alignments, and handwriting recognition based on digital ink. We evaluated usability, user experience, task load, text input performance, and handwriting style. Our results indicate high usability with a successful transfer of handwriting skills to the virtual domain. XR displays and surface alignments did not impact text input speed and error rate. However, sentence complexities did, with participants achieving higher input speeds and fewer errors for simple sentences (17.85 WPM, 0.51% MSD ER) than complex sentences (15.07 WPM, 1.74% MSD ER). Handwriting on physically aligned surfaces showed higher learnability and lower physical demand, making them more suitable for prolonged handwriting sessions. Handwriting on mid-air surfaces yielded higher novelty and stimulation ratings, which might diminish with more experience. Surface alignments and sentence complexities significantly affected handwriting style, leading to enlarged and more connected cursive writing in both mid-air and for simple sentences. The study also demonstrated the benefits of using XR controllers in a pen-like posture to mimic styluses and pressure-sensitive tips on physical surfaces for input detection. We additionally provide a phrase set of simple and complex sentences as a basis for future text input studies, which can be expanded and adapted.
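The WPM and MSD ER figures above follow the standard text-entry metrics from the literature: words per minute with a five-character word, and minimum string distance (Levenshtein) error rate over the longer string. The helpers below are a generic sketch of those textbook definitions, not the study's own evaluation code.

```python
def words_per_minute(transcribed: str, seconds: float) -> float:
    # One "word" is conventionally five characters, including spaces.
    return (len(transcribed) / 5) / (seconds / 60)

def msd_error_rate(target: str, transcribed: str) -> float:
    """Minimum string distance error rate in percent."""
    m, n = len(target), len(transcribed)
    if max(m, n) == 0:
        return 0.0
    # Single-row Levenshtein dynamic program.
    d = list(range(n + 1))
    for i in range(1, m + 1):
        prev, d[0] = d[0], i
        for j in range(1, n + 1):
            cur = d[j]
            d[j] = min(d[j] + 1,            # deletion
                       d[j - 1] + 1,        # insertion
                       prev + (target[i - 1] != transcribed[j - 1]))  # substitution
            prev = cur
    return 100 * d[n] / max(m, n)

speed = words_per_minute("the quick brown fox jumps", 60)  # 5.0 WPM
```

With definitions like these, the paper's simple-sentence result reads as roughly 17.85 five-character words transcribed per minute with about one uncorrected edit per 200 characters.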
Christian Merz, Jonathan Tschanter, Florian Kern, Jean-Luc Lugrin, Carolin Wienrich, Marc Erich Latoschik, Pipelining Processors for Decomposing Character Animation. In 30th ACM Symposium on Virtual Reality Software and Technology. New York, NY, USA: Association for Computing Machinery, 2024.
@inproceedings{merz2024processor,
title = {Pipelining Processors for Decomposing Character Animation},
author = {Merz, Christian and Tschanter, Jonathan and Kern, Florian and Lugrin, Jean-Luc and Wienrich, Carolin and Latoschik, Marc Erich},
booktitle = {30th ACM Symposium on Virtual Reality Software and Technology},
year = {2024},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3641825.3689533},
doi = {10.1145/3641825.3689533}
}
Abstract: This paper presents an openly available implementation of a modular pipeline architecture for character animation. It effectively decomposes frequently necessary processing steps into dedicated character processors, such as copying data from various motion sources, applying inverse kinematics, or scaling the character. Processors can easily be parameterized, extended (e.g., with AI), and freely arranged or even duplicated in any order necessary, greatly reducing side effects and fostering fine-tuning, maintenance, and reusability of the complex interplay of real-time animation steps.
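The decomposition idea — small, parameterizable processors run in a freely arrangeable order — can be sketched outside Unity as plain function composition. The names (`copy_source`, `scale`, a joint-dictionary pose) are invented for this sketch; the paper's implementation operates on engine-native character rigs.

```python
from typing import Callable, Dict

Pose = Dict[str, float]          # joint name -> value (heavily simplified)
Processor = Callable[[Pose], Pose]

def copy_source(source: Pose) -> Processor:
    def run(pose: Pose) -> Pose:
        return {**pose, **source}            # copy data from a motion source
    return run

def scale(factor: float) -> Processor:
    def run(pose: Pose) -> Pose:
        return {k: v * factor for k, v in pose.items()}  # scale the character
    return run

def run_pipeline(pose: Pose, processors: list) -> Pose:
    for p in processors:                     # processors run in list order...
        pose = p(pose)                       # ...and can be freely rearranged
    return pose

tracked = {"head_y": 1.8}
result = run_pipeline({}, [copy_source(tracked), scale(0.5)])
print(result)  # {'head_y': 0.9}
```

Because each step is an isolated value-to-value transform, reordering, duplicating, or swapping processors cannot produce hidden side effects — the property the paper highlights for maintenance and fine-tuning.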
2023
Florian Kern, Marc Erich Latoschik, Reality Stack I/O: A Versatile and Modular Framework for Simplifying and Unifying XR Applications and Research. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), pp. 74-76. 2023.
@inproceedings{10322199,
title = {Reality Stack I/O: A Versatile and Modular Framework for Simplifying and Unifying XR Applications and Research},
author = {Kern, Florian and Latoschik, Marc Erich},
booktitle = {2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
year = {2023},
pages = {74--76},
url = {https://ieeexplore.ieee.org/document/10322199},
doi = {10.1109/ISMAR-Adjunct60411.2023.00023}
}
Abstract: This paper introduces Reality Stack I/O (RSIO), a versatile and modular framework designed to facilitate the development of extended reality (XR) applications. Researchers and developers often spend a significant amount of time enabling cross-device and cross-platform compatibility, leading to delays and increased complexity. RSIO provides the essential features to simplify and unify the development of XR applications. It enhances cross-device and cross-platform compatibility, expedites integration, and allows developers to focus more on building XR experiences rather than device integration. We offer a public Unity reference implementation with examples.
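The intermediate-layer pattern the abstract describes can be sketched as an interface that device backends implement, so application code never touches a vendor SDK directly. All class and method names below are invented for illustration and are not RSIO's actual API.

```python
from abc import ABC, abstractmethod

class InputDevice(ABC):
    """Unified input interface; applications depend only on this layer."""
    @abstractmethod
    def poll_pointer(self) -> tuple:
        """Return the current pointer pose as (x, y, z)."""

class ControllerBackend(InputDevice):
    def poll_pointer(self) -> tuple:
        return (0.10, 1.20, 0.30)   # stub; would query the controller SDK

class HandTrackingBackend(InputDevice):
    def poll_pointer(self) -> tuple:
        return (0.10, 1.25, 0.30)   # stub; would query the hand tracker

def app_update(device: InputDevice) -> tuple:
    # Application logic is written once against the unified interface, so
    # swapping devices or platforms requires no application changes.
    return device.poll_pointer()

print(app_update(ControllerBackend()))   # (0.1, 1.2, 0.3)
```

This is the sense in which such a layer "expedites integration": each new device costs one backend class, not a pass over every application.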
Florian Kern, Florian Niebling, Marc Erich Latoschik, Text Input for Non-Stationary XR Workspaces: Investigating Tap and Word-Gesture Keyboards in Virtual and Augmented Reality. In IEEE Transactions on Visualization and Computer Graphics, pp. 2658-2669. 2023.
@article{kern2023input,
title = {Text Input for Non-Stationary XR Workspaces: Investigating Tap and Word-Gesture Keyboards in Virtual and Augmented Reality},
author = {Kern, Florian and Niebling, Florian and Latoschik, Marc Erich},
journal = {IEEE Transactions on Visualization and Computer Graphics},
year = {2023},
pages = {2658--2669},
url = {https://ieeexplore.ieee.org/document/10049665/},
doi = {10.1109/TVCG.2023.3247098}
}
Abstract: This article compares two state-of-the-art text input techniques between non-stationary virtual reality (VR) and video see-through augmented reality (VST AR) use cases as XR display conditions. The developed contact-based mid-air virtual tap and word-gesture (swipe) keyboards provide established support functions for text correction, word suggestions, capitalization, and punctuation. A user evaluation with 64 participants revealed that XR displays and input techniques strongly affect text entry performance, while subjective measures are only influenced by the input techniques. We found significantly higher usability and user experience ratings for tap keyboards compared to swipe keyboards in both VR and VST AR. Task load was also lower for tap keyboards. In terms of performance, both input techniques were significantly faster in VR than in VST AR. Further, the tap keyboard was significantly faster than the swipe keyboard in VR. Participants showed a significant learning effect with only ten sentences typed per condition. Our results are consistent with previous work in VR and optical see-through (OST) AR, but additionally provide novel insights into the usability and performance of the selected text input techniques for VST AR. The significant differences in subjective and objective measures emphasize the importance of specific evaluations for each possible combination of input techniques and XR displays to provide reusable, reliable, and high-quality text input solutions. With our work, we form a foundation for future research and XR workspaces. Our reference implementation is publicly available to encourage replicability and reuse in future XR workspaces.
Florian Kern, Jonathan Tschanter, Marc Erich Latoschik, Virtual-to-Physical Surface Alignment and Refinement Techniques for Handwriting, Sketching, and Selection in XR. In 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), pp. 502-506. 2023.
@inproceedings{10108564,
title = {Virtual-to-Physical Surface Alignment and Refinement Techniques for Handwriting, Sketching, and Selection in XR},
author = {Kern, Florian and Tschanter, Jonathan and Latoschik, Marc Erich},
booktitle = {2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},
year = {2023},
pages = {502--506},
url = {https://ieeexplore.ieee.org/document/10108564/},
doi = {10.1109/VRW58643.2023.00109}
}
Abstract: The alignment of virtual to physical surfaces is essential to improve symbolic input and selection in XR. Previous techniques optimized for efficiency can lead to inaccuracies. We investigate regression-based refinement techniques and introduce a surface accuracy evaluation. The results revealed that refinement techniques can substantially improve surface accuracy and show that accuracy depends on the gesture shape and surface dimension. Our reference implementation and dataset are publicly available.
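As one concrete instance of the technique class "regression-based refinement", consider a least-squares plane fit to pen-tip samples collected on a roughly horizontal physical surface. This is a generic sketch under that assumption; the paper's own regression models are described in the publication itself.

```python
def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to (x, y, z) samples,
    solving the 3x3 normal equations by Gaussian elimination."""
    # Accumulate A^T A and A^T z for design-matrix rows (x, y, 1).
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for x, y, z in points:
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                M[i][j] += row[i] * row[j]
            v[i] += row[i] * z
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(3):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    # Back substitution.
    out = [0.0] * 3
    for r in (2, 1, 0):
        out[r] = (v[r] - sum(M[r][c] * out[c] for c in range(r + 1, 3))) / M[r][r]
    return tuple(out)  # (a, b, c)

# Samples taken on the plane z = 0.02*x - 0.01*y + 0.7 recover it.
a, b, c = fit_plane([(0, 0, 0.7), (1, 0, 0.72), (0, 1, 0.69), (1, 1, 0.71)])
```

Refining an efficiently obtained initial alignment against many such samples is what lets the fitted surface absorb per-sample tracking noise instead of inheriting it.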
2021
Florian Kern, Matthias Popp, Peter Kullmann, Elisabeth Ganal, Marc Erich Latoschik, 3D Printing an Accessory Dock for XR Controllers and its Exemplary Use as XR Stylus. In 27th ACM Symposium on Virtual Reality Software and Technology, pp. 1-3. Osaka, Japan: Association for Computing Machinery, 2021.
@inproceedings{kern2021printing,
title = {3D Printing an Accessory Dock for XR Controllers and its Exemplary Use as XR Stylus},
author = {Kern, Florian and Popp, Matthias and Kullmann, Peter and Ganal, Elisabeth and Latoschik, Marc Erich},
booktitle = {27th ACM Symposium on Virtual Reality Software and Technology},
year = {2021},
pages = {1--3},
publisher = {Association for Computing Machinery},
address = {Osaka, Japan},
url = {https://doi.org/10.1145/3489849.3489949},
doi = {10.1145/3489849.3489949}
}
Abstract: This article introduces the accessory dock, a 3D printed multipurpose extension for consumer-grade XR controllers that enables flexible mounting of self-made and commercial accessories. The uniform design of our concept opens new opportunities for XR systems being used for more diverse purposes, e.g., researchers and practitioners could use and compare arbitrary XR controllers within their experiments while ensuring access to buttons and battery housing. As a first example, we present a stylus tip accessory to build an XR Stylus, which can be directly used with frameworks for handwriting, sketching, and UI interaction on physically aligned virtual surfaces. For new XR controllers, we provide instructions on how to adjust the accessory dock to the controller’s form factor. A video tutorial for the construction and the source files for 3D printing are publicly available for reuse, replication, and extension (https://go.uniwue.de/hci-otss-accessory-dock).
Carla Winter, Florian Kern, Dominik Gall, Marc Erich Latoschik, Paul Pauli, Ivo Käthner, Immersive virtual reality during gait rehabilitation increases walking speed and motivation: A usability evaluation with healthy participants and patients with multiple sclerosis and stroke. In Journal of NeuroEngineering and Rehabilitation, Vol. 18 (68). 2021.
@article{winter2021immersive,
title = {Immersive virtual reality during gait rehabilitation increases walking speed and motivation: A usability evaluation with healthy participants and patients with multiple sclerosis and stroke},
author = {Winter, Carla and Kern, Florian and Gall, Dominik and Latoschik, Marc Erich and Pauli, Paul and Käthner, Ivo},
journal = {Journal of NeuroEngineering and Rehabilitation},
year = {2021},
volume = {18},
number = {68},
url = {https://jneuroengrehab.biomedcentral.com/articles/10.1186/s12984-021-00848-w},
doi = {10.1186/s12984-021-00848-w}
}
Abstract: Background. The rehabilitation of gait disorders in multiple sclerosis (MS) and stroke patients is often based on conventional treadmill training. Virtual reality (VR)-based treadmill training can increase motivation and improve therapy outcomes.
Objective. The present study aimed at (1) demonstrating the feasibility and acceptance of an immersive virtual reality application (presented via head-mounted display, HMD) for gait rehabilitation with patients, and (2) comparing its effects to a semi-immersive presentation (via a monitor) and conventional treadmill training without VR.
Methods and results. 36 healthy participants and 14 persons with MS or stroke participated in each of the three experimental conditions. For both groups, the walking speed in the HMD condition was higher than in treadmill training without VR. Healthy participants reported higher motivation after the HMD condition as compared with the other conditions. Importantly, no side effects in the sense of simulator sickness occurred, and usability ratings were high. Most of the healthy study participants (89%) and patients (71%) preferred the HMD-based training among the three conditions, and most patients could imagine using it more frequently.
Conclusion. The study demonstrated the feasibility of combining treadmill training with immersive VR. Due to its high usability and low side effects, the immersive system could serve as a valid alternative to conventional treadmill training in gait rehabilitation. It might be particularly suited to improve patients' training motivation and training outcome, e.g., walking speed, compared with treadmill training using no or only semi-immersive VR.
Florian Kern, Peter Kullmann, Elisabeth Ganal, Kristof Korwisi, Rene Stingl, Florian Niebling, Marc Erich Latoschik, Off-The-Shelf Stylus: Using XR Devices for Handwriting and Sketching on Physically Aligned Virtual Surfaces. In Frontiers in Virtual Reality, Daniel Zielasko (Ed.), Vol. 2, p. 69. 2021.
@article{kern2021offtheshelf,
title = {Off-The-Shelf Stylus: Using XR Devices for Handwriting and Sketching on Physically Aligned Virtual Surfaces},
author = {Kern, Florian and Kullmann, Peter and Ganal, Elisabeth and Korwisi, Kristof and Stingl, Rene and Niebling, Florian and Latoschik, Marc Erich},
editor = {Zielasko, Daniel},
journal = {Frontiers in Virtual Reality},
year = {2021},
volume = {2},
pages = {69},
url = {https://www.frontiersin.org/articles/10.3389/frvir.2021.684498},
doi = {10.3389/frvir.2021.684498}
}
Abstract: This article introduces the Off-The-Shelf Stylus (OTSS), a framework for 2D interaction (in 3D) as well as for handwriting and sketching with digital pen, ink, and paper on physically aligned virtual surfaces in Virtual, Augmented, and Mixed Reality (VR, AR, MR: XR for short). OTSS supports self-made XR styluses based on consumer-grade six-degrees-of-freedom XR controllers and commercially available styluses. The framework provides separate modules for three basic but vital features: 1) The stylus module provides stylus construction and calibration features. 2) The surface module provides surface calibration and visual feedback features for virtual-physical 2D surface alignment using our so-called 3ViSuAl procedure, and surface interaction features. 3) The evaluation suite provides a comprehensive test bed combining technical measurements for precision, accuracy, and latency with extensive usability evaluations including handwriting and sketching tasks based on established visuomotor, graphomotor, and handwriting research. The framework’s development is accompanied by an extensive open source reference implementation targeting the Unity game engine using an Oculus Rift S headset and Oculus Touch controllers. The development compares three low-cost and low-tech options to equip controllers with a tip and includes a web browser-based surface providing support for interacting, handwriting, and sketching. The evaluation of the reference implementation based on the OTSS framework identified an average stylus precision of 0.98 mm (SD = 0.54 mm) and an average surface accuracy of 0.60 mm (SD = 0.32 mm) in a seated VR environment. The time for displaying the stylus movement as digital ink on the web browser surface in VR was 79.40 ms on average (SD = 23.26 ms), including the physical controller’s motion-to-photon latency visualized by its virtual representation (M = 42.57 ms, SD = 15.70 ms). 
The usability evaluation (N = 10) revealed a low task load, high usability, and high user experience. Participants successfully reproduced given shapes and created legible handwriting, indicating that the OTSS and its reference implementation are ready for everyday use. We provide source code access to our implementation, including stylus and surface calibration and surface interaction features, making it easy to reuse, extend, adapt, and/or replicate previous results (https://go.uniwue.de/hci-otss).
Florian Kern, Thore Keser, Florian Niebling, Marc Erich Latoschik, Using Hand Tracking and Voice Commands to Physically Align Virtual Surfaces in AR for Handwriting and Sketching with HoloLens 2. In 27th ACM Symposium on Virtual Reality Software and Technology, pp. 1-3. Osaka, Japan: Association for Computing Machinery, 2021. Best poster award. 🏆
@inproceedings{kern2021using,
title = {Using Hand Tracking and Voice Commands to Physically Align Virtual Surfaces in AR for Handwriting and Sketching with HoloLens 2},
author = {Kern, Florian and Keser, Thore and Niebling, Florian and Latoschik, Marc Erich},
booktitle = {27th ACM Symposium on Virtual Reality Software and Technology},
year = {2021},
pages = {1--3},
publisher = {Association for Computing Machinery},
address = {Osaka, Japan},
note = {Best poster award. 🏆},
url = {https://doi.org/10.1145/3489849.3489940},
doi = {10.1145/3489849.3489940}
}
Abstract: In this paper, we adapt an existing VR framework for handwriting and sketching on physically aligned virtual surfaces to AR environments using the Microsoft HoloLens 2. We demonstrate a multimodal input metaphor to control the framework’s calibration features using hand tracking and voice commands. Our technical evaluation of fingertip/surface accuracy and precision on physical tables and walls is in line with existing measurements on comparable hardware, albeit considerably lower compared to previous work using controller-based VR devices. We discuss design considerations and the benefits of our unified input metaphor suitable for controller tracking and hand tracking systems. We encourage extensions and replication by providing a publicly available reference implementation (https://go.uniwue.de/hci-otss-hololens).
2019
Jean-Luc Lugrin, Florian Kern, Constantin Kleinbeck, Daniel Roth, Christian Daxery, Tobias Feigl, Christopher Mutschler, Marc Erich Latoschik, A Framework for Location-Based VR Applications. In Proceedings of the GI VR/AR - Workshop. Shaker Verlag, 2019.
@inproceedings{LugrinHolopark2019,
title = {A Framework for Location-Based VR Applications},
author = {Lugrin, Jean-Luc and Kern, Florian and Kleinbeck, Constantin and Roth, Daniel and Daxery, Christian and Feigl, Tobias and Mutschler, Christopher and Latoschik, Marc Erich},
booktitle = {Proceedings of the GI VR/AR - Workshop},
year = {2019},
publisher = {Shaker Verlag},
url = {http://downloads.hci.informatik.uni-wuerzburg.de/2019-gi-vr-ar-framework-for-location-based-vr-applications.pdf},
doi = {10.2370/9783844068870}
}
Abstract: This paper presents a framework to develop and investigate location-based Virtual Reality (VR) applications. We demonstrate our framework by introducing a novel type of VR museum, designed to support a large number of simultaneous co-located users. These visitors are walking in a hangar-scale tracking zone (600 m²), while sharing a ten times bigger virtual space (7000 m²). Co-located VR applications like this one are opening novel VR perspectives. However, sharing a limitless virtual world using a large, but limited, tracking space is also raising numerous challenges: from financial considerations and technical implementation to interactions and evaluations (e.g., user’s representation, navigation, health & safety, monitoring). How to design, develop and evaluate such a VR system is still an open question. Here, we describe a fully implemented framework with its specific features and performance optimizations. We also illustrate our framework’s viability with a first VR application and discuss its potential benefits for education and future evaluation.
Florian Kern, Carla Winter, Dominik Gall, Ivo Käthner, Paul Pauli, Marc Erich Latoschik, Immersive Virtual Reality and Gamification Within Procedurally Generated Environments to Increase Motivation During Gait Rehabilitation. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 500-509. 2019.
@inproceedings{kern2019immersive,
title = {Immersive Virtual Reality and Gamification Within Procedurally Generated Environments to Increase Motivation During Gait Rehabilitation},
author = {Kern, Florian and Winter, Carla and Gall, Dominik and Käthner, Ivo and Pauli, Paul and Latoschik, Marc Erich},
booktitle = {2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)},
year = {2019},
pages = {500--509},
url = {https://downloads.hci.informatik.uni-wuerzburg.de/2019-ieeevr-homecoming.pdf},
doi = {10.1109/VR.2019.8797828}
}
Abstract: Virtual Reality (VR) technology offers promising opportunities to improve traditional treadmill-based rehabilitation programs. We present an immersive VR rehabilitation system that includes a head-mounted display and motion sensors. The application is designed to promote the experience of relatedness, autonomy, and competence. The application uses procedural content generation to generate diverse landscapes. We evaluated the effect of the immersive rehabilitation system on motivation and affect. We conducted a repeated measures study with 36 healthy participants to compare the immersive program to a traditional rehabilitation program. Participants reported significantly greater enjoyment, felt more competent, and experienced higher decision freedom and meaningfulness in the immersive VR gait training compared to the traditional training. They experienced significantly lower physical demand, simulator sickness, and state anxiety, and felt less pressured while still perceiving a higher personal performance. We derive three design implications for future applications in gait rehabilitation: Immersive VR provides a promising augmentation for gait rehabilitation. Gamification features provide a design guideline for content creation in gait rehabilitation. Relatedness and autonomy provide critical content features in gait rehabilitation.
Marc Erich Latoschik, Florian Kern, Jan-Philipp Stauffert, Andrea Bartl, Mario Botsch, Jean-Luc Lugrin, Not Alone Here?! Scalability and User Experience of Embodied Ambient Crowds in Distributed Social Virtual Reality. In IEEE Transactions on Visualization and Computer Graphics (TVCG), Vol. 25 (5), pp. 2134-2144. 2019.
@article{latoschik2019alone,
title = {Not Alone Here?! Scalability and User Experience of Embodied Ambient Crowds in Distributed Social Virtual Reality},
author = {Latoschik, Marc Erich and Kern, Florian and Stauffert, Jan-Philipp and Bartl, Andrea and Botsch, Mario and Lugrin, Jean-Luc},
journal = {IEEE Transactions on Visualization and Computer Graphics (TVCG)},
year = {2019},
volume = {25},
number = {5},
pages = {2134--2144},
url = {https://ieeexplore.ieee.org/document/8643417},
doi = {10.1109/TVCG.2019.2899250}
}
Abstract: This article investigates performance and user experience in Social Virtual Reality (SVR) targeting distributed, embodied, and immersive face-to-face encounters. We demonstrate the close relationship between scalability, reproduction accuracy, and the resulting performance characteristics, as well as the impact of these characteristics on users co-located with larger groups of embodied virtual others. System scalability provides a variable number of co-located avatars and AI-controlled agents with a variety of different appearances, including realistic-looking virtual humans generated from photogrammetry scans. The article reports on how to meet the requirements of embodied SVR with today's technical off-the-shelf solutions and what to expect regarding features, performance, and potential limitations. Special care has been taken to achieve the low latencies and sufficient frame rates necessary for reliable communication of embodied social signals. We propose a hybrid evaluation approach which coherently relates results from technical benchmarks to subjective ratings and which confirms required performance characteristics for the target scenario of larger distributed groups. A user study reveals positive effects of an increasing number of co-located social companions on the quality of experience of virtual worlds, i.e., on presence, possibility of interaction, and co-presence. It also shows that variety in avatar/agent appearance might increase eeriness but might also stimulate an increased interest of participants about the environment.
Carla Winter, Florian Kern, Ivo Käthner, Dominik Gall, Marc Erich Latoschik, Paul Pauli,
Virtuelle Realität als Ergänzung des Laufbandtrainings zur Rehabilitation von Gangstörungen bei Patienten mit Schlaganfall und Multipler Sklerose
, In
Ethik in der Medizin
, Vol.
14
(15)
.
2019.
[BibTeX]
[Abstract]
[Download]
[BibSonomy]
@presentation{winter2019virtuelle,
title = {Virtuelle Realität als Ergänzung des Laufbandtrainings zur Rehabilitation von Gangstörungen bei Patienten mit Schlaganfall und Multipler Sklerose},
author = {Winter, Carla and Kern, Florian and Käthner, Ivo and Gall, Dominik and Latoschik, Marc Erich and Pauli, Paul},
journal = {Ethik in der Medizin},
year = {2019},
volume = {14},
number = {15},
url = {}
}
Abstract: Virtual reality (VR) technology offers new treatment options in the rehabilitation of neurological disorders. Previous studies have shown that VR-based treadmill training in patients with gait disorders improves not only physical but also psychological treatment outcomes, making it a useful complement to conventional gait training.
The present study examined the effects of an immersive presentation of a virtual environment (via a head-mounted display, HMD) compared with a semi-immersive presentation of the VR (via a flat-screen monitor) and conventional treadmill training without VR.
To this end, first 36 healthy participants and subsequently 14 MS and stroke patients with gait disorders each completed the three treadmill conditions (immersive, semi-immersive, and without VR).
The virtual environment included gamification elements to increase motivation and was implemented on the basis of Ryan and Deci's self-determination theory. The study with healthy participants served to test usability and uncover technical deficits. In a subsequent proof-of-concept study with 14 MS and stroke patients, the application was used to test whether patients undergoing treatment for their gait disorders (EDSS < 6) could improve their walking abilities through VR-supported treadmill training. The primary outcome measure in both studies was the average walking speed within the individual conditions. Standardized questionnaires additionally assessed motivation, usability, sense of presence (Igroup Presence Questionnaire), and side effects of the VR system (Simulator Sickness Questionnaire). Participants were also asked about their preference among the three conditions.
Both the study with healthy participants and the patient study showed a significantly higher average walking speed in the HMD condition than in treadmill training without VR. In both studies, the sense of presence was significantly higher in the HMD condition than in the monitor condition. Furthermore, no side effects of the virtual world in the sense of simulator sickness occurred. Participants had no relevant VR-related postural difficulties or problems with the visual display. While motivation among the healthy participants was higher after the HMD session than in the other two conditions, no significant differences were detected in the patient study. Nevertheless, the patients reported finding the virtual world more motivating in the HMD condition than in the monitor condition. Across all three conditions, HMD treadmill training was preferred by 71% (n = 14) of the patients and 89% (n = 36) of the healthy participants. Likewise, 71% of the patients could imagine using HMD treadmill training more frequently in the future.
2018
Jean-Luc Lugrin, Florian Kern, Ruben Schmidt, Constantin Kleinbeck, Daniel Roth, Christian Daxer, Tobias Feigl, Christopher Mutschler, Marc Erich Latoschik,
A Location-Based VR Museum
, In
Proceedings of the 10th IEEE International Conference on Virtual Worlds for Serious Applications (VS-Games)
IEEE (Ed.),
, pp. 1-2
.
IEEE
, 2018.
[BibTeX]
[Abstract]
[Download]
[BibSonomy]
[Doi]
@inproceedings{lugrin2018locationbased,
title = {A Location-Based VR Museum},
author = {Lugrin, Jean-Luc and Kern, Florian and Schmidt, Ruben and Kleinbeck, Constantin and Roth, Daniel and Daxer, Christian and Feigl, Tobias and Mutschler, Christopher and Latoschik, Marc Erich},
editor = {IEEE},
booktitle = {Proceedings of the 10th IEEE International Conference on Virtual Worlds for Serious Applications (VS-Games)},
year = {2018},
pages = {1-2},
publisher = {IEEE},
url = {https://downloads.hci.informatik.uni-wuerzburg.de/2018-lugrin-vrmuseum-vsgames.pdf},
doi = {10.1109/VS-Games.2018.8493404}
}
Abstract: This poster presents a novel type of Virtual Reality (VR) application for education and culture: a location-based VR museum, i.e., a large room-scale, multi-user, multi-zone virtual museum. The VR museum was designed to support over 100 simultaneous users walking within a large tracking area (600 m²) while sharing a virtual space ten times larger (7,000 m²) containing indoor and outdoor dinosaur exhibitions. This poster gives an overview of the system and its main features and discusses its potential benefits and future evaluation.