2025
Lukas Schach, Christian Rack, Ryan P. McMahan, Marc Erich Latoschik, Motion-Based User Identification across XR and Metaverse Applications by Deep Classification and Similarity Learning. 2025.
@misc{schach2025motionbaseduseridentificationxr,
title = {Motion-Based User Identification across XR and Metaverse Applications by Deep Classification and Similarity Learning},
author = {Schach, Lukas and Rack, Christian and McMahan, Ryan P. and Latoschik, Marc Erich},
year = {2025},
url = {https://arxiv.org/abs/2509.08539}
}
Christian Merz, Lukas Schach, Marie Luisa Fiedler, Jean-Luc Lugrin, Carolin Wienrich, Marc Erich Latoschik, Unobtrusive In-Situ Measurement of Behavior Change by Deep Metric Similarity Learning of Motion Patterns. 2025.
@misc{merz2025unobtrusiveinsitumeasurementbehavior,
title = {Unobtrusive In-Situ Measurement of Behavior Change by Deep Metric Similarity Learning of Motion Patterns},
author = {Merz, Christian and Schach, Lukas and Fiedler, Marie Luisa and Lugrin, Jean-Luc and Wienrich, Carolin and Latoschik, Marc Erich},
year = {2025},
url = {https://arxiv.org/abs/2509.04174}
}
2024
Christian Rack, Lukas Schach, Felix Achter, Yousof Shehada, Jinghuai Lin, Marc Erich Latoschik, Motion Passwords. In Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology (19), pp. 1-11. New York, NY, USA: Association for Computing Machinery, 2024.
@conference{rack2024motion,
title = {Motion Passwords},
author = {Rack, Christian and Schach, Lukas and Achter, Felix and Shehada, Yousof and Lin, Jinghuai and Latoschik, Marc Erich},
booktitle = {Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology},
year = {2024},
number = {19},
pages = {1-11},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3641825.3687711},
doi = {10.1145/3641825.3687711}
}
Abstract: This paper introduces “Motion Passwords”, a novel biometric authentication approach where virtual reality users verify their identity by physically writing a chosen word in the air with their hand controller. This method allows combining three layers of verification: knowledge-based password input, handwriting style analysis, and motion profile recognition. As a first step towards realizing this potential, we focus on verifying users based on their motion profiles. We conducted a data collection study with 48 participants, who performed over 3800 Motion Password signatures across two sessions. We assessed the effectiveness of feature-distance and similarity-learning methods for motion-based verification using the Motion Passwords as well as specific and uniform ball-throwing signatures used in previous works. In our results, the similarity-learning model was able to verify users with the same accuracy for both signature types. This demonstrates that Motion Passwords, even when applying only the motion-based verification layer, achieve reliability comparable to previous methods. This highlights the potential for Motion Passwords to become even more reliable with the addition of knowledge-based and handwriting style verification layers. Furthermore, we present a proof-of-concept Unity application demonstrating the registration and verification process with our pretrained similarity-learning model. We publish our code, the Motion Password dataset, the pretrained model, and our Unity prototype at https://github.com/cschell/MoPs.
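The similarity-learning verification described in the abstract boils down to comparing an embedding of a new signature against an enrolled one and accepting if they are close enough. The sketch below is not the authors' pretrained model; it is a minimal NumPy illustration of the decision step, assuming the model has already mapped signatures to fixed-length embedding vectors, with a hypothetical threshold of 0.8.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(enrolled: np.ndarray, attempt: np.ndarray, threshold: float = 0.8) -> bool:
    """Accept the attempt if its embedding is close enough to the enrolled one."""
    return cosine_similarity(enrolled, attempt) >= threshold

# Toy vectors standing in for embeddings produced by a similarity-learning model.
enrolled = np.array([1.0, 0.0, 0.0])
genuine  = np.array([0.9, 0.1, 0.0])   # same user, slightly different motion
impostor = np.array([0.0, 1.0, 0.0])   # different user

print(verify(enrolled, genuine))   # close embedding -> accepted
print(verify(enrolled, impostor))  # distant embedding -> rejected
```

The threshold trades off false accepts against false rejects; in practice it would be tuned on a held-out set of genuine and impostor pairs.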
Christian Rack, Vivek Nair, Lukas Schach, Felix Foschum, Marcel Roth, Marc Erich Latoschik, Navigating the Kinematic Maze: Analyzing, Standardizing and Unifying XR Motion Datasets. In 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). 2024.
@inproceedings{noauthororeditor2024navigating,
title = {Navigating the Kinematic Maze: Analyzing, Standardizing and Unifying XR Motion Datasets},
author = {Rack, Christian and Nair, Vivek and Schach, Lukas and Foschum, Felix and Roth, Marcel and Latoschik, Marc Erich},
booktitle = {2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},
year = {2024},
url = {http://downloads.hci.informatik.uni-wuerzburg.de/2024-01-Rack-Navigating_the_Kinematic_Maze.pdf},
doi = {10.1109/VRW62533.2024.00098}
}
Abstract: This paper addresses the critical importance of standards and documentation in kinematic research, particularly within Extended Reality (XR) environments. We focus on the pivotal role of motion data, emphasizing the challenges posed by the current lack of standardized practices in XR user motion datasets. Our work involves a detailed analysis of 8 existing datasets, identifying gaps in documentation and essential specifications such as coordinate systems, rotation representations, and units of measurement. We highlight how these gaps can lead to misinterpretations and irreproducible results. Based on our findings, we propose a set of guidelines and best practices for creating and documenting motion datasets, aiming to improve their quality, usability, and reproducibility. We also created a web-based tool for visual inspection of motion recordings, further aiding in dataset evaluation and standardization. Furthermore, we introduce the XR Motion Dataset Catalogue, a collection of the analyzed datasets in a unified and aligned format. This initiative significantly streamlines access for researchers, allowing them to download partial or entire datasets with a single line of code and without the need for additional alignment efforts. Our contributions enhance dataset integrity and reliability in kinematic research, paving the way for more consistent and scientifically robust studies in this evolving field.
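The alignment problem the abstract describes (mismatched units, coordinate systems, and handedness across datasets) can be illustrated with a small sketch. This is not the paper's tooling or the Dataset Catalogue's API; it is a hedged NumPy example of one standardization step, assuming a target convention of meters in a right-handed frame and a source dataset logged in centimeters by a left-handed engine such as Unity.

```python
import numpy as np

def standardize_positions(xyz: np.ndarray, unit_scale: float, flip_z: bool) -> np.ndarray:
    """Bring positional tracking data into a common convention.

    unit_scale converts the source unit to meters (e.g. 0.01 for
    centimeters); flip_z negates the z axis for datasets recorded
    in a left-handed coordinate frame.
    """
    out = xyz.astype(np.float64) * unit_scale
    if flip_z:
        out[:, 2] *= -1.0
    return out

# A head position logged in centimeters, left-handed:
raw = np.array([[10.0, 170.0, 50.0]])
aligned = standardize_positions(raw, unit_scale=0.01, flip_z=True)
# aligned is (0.1, 1.7, -0.5): meters, right-handed.
```

Rotation representations (quaternions vs. Euler angles, and their axis conventions) need an analogous, documented conversion; the paper's point is that without such documentation the required transforms cannot even be inferred.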
2023
Christian Rack, Lukas Schach, Marc Latoschik, Motion Learning Toolbox – A Python library for preprocessing of XR motion tracking data for machine learning applications. 2023.
@misc{rack2023motionlearningtoolbox,
title = {Motion Learning Toolbox – A Python library for preprocessing of XR motion tracking data for machine learning applications},
author = {Rack, Christian and Schach, Lukas and Latoschik, Marc},
year = {2023},
url = {https://github.com/cschell/Motion-Learning-Toolbox}
}
Abstract: The Motion Learning Toolbox is a Python library designed to facilitate the preprocessing of motion tracking data in extended reality (XR) setups. It's particularly useful for researchers and engineers wanting to use XR tracking data as input for machine learning models. Originally developed for academic research targeting the identification of XR users by their motions, this toolbox includes a variety of data encoding methods that enhance machine learning model performance.
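One family of encodings the abstract alludes to can be sketched in plain NumPy. The toolbox's own function names are not reproduced here; this is a generic illustration of a velocity encoding, one common way to make tracking data invariant to where the user stands in the scene.

```python
import numpy as np

def velocity_encoding(positions: np.ndarray, fps: float) -> np.ndarray:
    """Encode absolute tracker positions as per-frame velocities.

    Frame-to-frame differences remove the user's absolute location,
    which tends to help models generalize across rooms and sessions.
    Input: (frames, 3) positions; output: (frames - 1, 3) velocities
    in units per second.
    """
    return np.diff(positions, axis=0) * fps

# Four frames of a head tracker moving along x, sampled at 15 Hz:
positions = np.array([[0.0, 1.7, 0.0],
                      [0.1, 1.7, 0.0],
                      [0.2, 1.7, 0.0],
                      [0.3, 1.7, 0.0]])
velocities = velocity_encoding(positions, fps=15.0)
# each velocity row is (1.5, 0.0, 0.0): 1.5 units/s along x.
```

For the toolbox's actual preprocessing functions and their signatures, see the repository linked in the entry above.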
2022
Christian Rack, Fabian Sieper, Lukas Schach, Murat Yalcin, Marc E. Latoschik, Dataset: Who is Alyx? (GitHub Repository). 2022.
@dataset{who_is_alyx_2022,
title = {Dataset: Who is Alyx? (GitHub Repository)},
author = {Rack, Christian and Sieper, Fabian and Schach, Lukas and Yalcin, Murat and Latoschik, Marc E.},
year = {2022},
url = {https://github.com/cschell/who-is-alyx},
doi = {10.5281/zenodo.6472417}
}
Abstract: This dataset contains over 110 hours of motion, eye-tracking and physiological data from 71 players of the virtual reality game “Half-Life: Alyx”. Each player played the game on two separate days for about 45 minutes using an HTC Vive Pro.