Monitoring Kinematic Constraints of Humanoid Rigs for Physical Exercises in Virtual Reality
Outline
1. Motivation
Humanoid rigs are used to animate avatars or characters. In general, a rig consists of several virtual bones that form the virtual skeleton as a connected system. How and to what extent the joints can move varies. The underlying kinematic models of the rig are mostly abstract mathematical descriptions of objects moving in space and of their relationships. This means that parameters such as the position, angle, and velocity of the objects in the kinematic chain do not adhere to any theoretical restrictions. Human locomotion, in contrast, is always subject to constraints, since these kinematic parameters depend on the properties of each joint. For more specific and realistic applications, it therefore makes sense not only to provide the opportunity to adjust the kinematic constraints of humanoid rigs, but above all to signal to users when these limits are violated. To monitor violations of the constraints during the performance of exercises, an additional system is needed that extends current implementations of humanoid rigs. The system is intended to support physical therapists and patients in the rehabilitation process by making it possible to modify the limits of the humanoid rig according to reasonable constraints. The theoretical background, the concept for a system that monitors violations of kinematic constraints of humanoid rigs in rehabilitation scenarios, and the methodological approach with its potential features are explained in more detail in the following.
2. Related Work
Virtual reality rehabilitation applications are becoming increasingly important because they can be more effective than conventional therapy [1]. Virtual reality (VR) rehabilitation requires a virtual representation of the user's body that depicts the user in VR, called an avatar. Commonly, an avatar has a built-in rig onto which kinematic data can be mapped to represent motion made by the user or by a model. Since the avatar serves as a display for motion data, it is reasonable to use this data for direct feedback on the user's motion in rehabilitation scenarios. Several approaches to improving VR rehabilitation applications by providing feedback on users' locomotion can be found in the literature. Camporesi, Kallmann, and Han [2] published an interface that evaluates the quality of rehabilitation exercises. Their system monitors information about the motion, the end-effector trajectories, the range of motion, and joint angles by comparing the avatar of the therapist with the avatar of the patient. The virtual agent of the therapist can be connected to the exercise session to exercise along with the patients or to demonstrate the exercises remotely with pre-recorded material. The comparison is shown via their user interface, which provides constant feedback during the execution of exercises. They suggest that, with continued development and research, VR technology has the potential to revolutionize the field of physical therapy and provide significant benefits to patients and healthcare professionals alike. Zhao et al. [3] developed a rule-based system that processes motion data from a Kinect sensor. They focused on designing a system that gives real-time feedback to patients to guide them through exercises, which is especially useful in home exercise scenarios.
The rules represent the joints including their kinematic constraints and are stored in XML files, which are evaluated by a finite state machine at runtime. Their user interface shows the patient's avatar and a model-generated avatar. The system provides visual cues and guidance on how to adjust the movement to achieve the correct posture, which is represented and demonstrated by the model-generated avatar. Overall, the paper demonstrates the potential of rule-based systems for human motion tracking in rehabilitation exercises, which can improve the quality and efficiency of the rehabilitation process. Zhu et al. [4] present a system that visualizes the muscle engagement of the exercising user on an avatar. They aim to enhance patients' sense of motion during physical therapy exercises. They argue and show that measuring muscle activity with EMG sensors and displaying the results on the avatar helps patients when exercising unsupervised and relieves physical therapists. Their monitoring system can be displayed not only in VR but also on other types of displays, which is a good prerequisite for using the system for other demonstrative purposes. These three papers [2-4] show that providing feedback via a fully rigged avatar is a promising way to monitor violations of kinematic constraints and thus ensure the safety and correctness of rehabilitation exercise execution. Humanoid rigs, with their virtual bones and joints, form a digital structure on which an additional system can be built to monitor violations of the constraints. The goal of my thesis is to extend state-of-the-art humanoid rigs used in virtual environment-enhanced physical therapy so that they help to understand the patient's kinematic constraints better and guide patients when the therapist is absent, i.e., when exercising unsupervised.
In the following, I present my research questions and the methodological approach to answer them.
3. Research Questions
(1) How can humanoid rigs be extended such that the execution of rehabilitation exercises can be monitored?
(2) How can violations of the kinematic constraints of humanoid rigs be signaled to therapists and patients via a user interface?
4. Methodology
Overall, I want to develop a monitoring system that provides users with feedback on their motion in VR rehabilitation exercise scenarios. The thesis will be structured chronologically: deciding which development environment to use, implementing an internal monitoring system, and implementing a system that provides users with (visual) feedback on what the internal system is processing. Since Unity [5] is a widely used development environment for VR applications, I will use Unity to set up and implement my monitoring system. Additionally, Unity provides built-in support for avatar structures, and I want to use this framework to implement my monitoring system. Although the framework allows the integration of differently encoded avatars, an interface is needed to ensure that the monitoring system can work with these different data structures. To monitor the kinematic constraints properly, two main features have to be implemented. The first is a computational system that decides whether predefined constraints are violated. This includes the definition of the humanoid joints in the avatar and their restrictions, which will be oriented towards the morphological structure of real humans. Furthermore, the computational system needs a clear definition of what exactly constitutes a violation. As avatar structures do not adhere to any theoretical restrictions concerning their joints, this feature will be implemented by extending each joint of the rig with a component that constantly checks whether the kinematic constraints are violated. Since every joint has a transform component in Unity, kinematic parameters can be extracted by accessing the transform component of each joint. Nevertheless, the system must also go beyond the simple processing of kinematic parameters such as position, angle, and velocity: it needs a further component that puts each joint and its constraints in relation to other joints and their constraints.
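The per-joint check described above can be sketched in a language-agnostic form. The following minimal Python sketch illustrates the idea; the joint names, angle limits, and data structures are illustrative assumptions, not the final Unity implementation, where the angles would instead be read from each joint's transform component:

```python
from dataclasses import dataclass

@dataclass
class JointConstraint:
    """Allowed rotation range for one joint axis, in degrees."""
    name: str
    min_angle: float
    max_angle: float

    def violation(self, angle: float) -> float:
        """Return 0.0 if the angle is within limits, otherwise
        the number of degrees by which the limit is exceeded."""
        if angle < self.min_angle:
            return self.min_angle - angle
        if angle > self.max_angle:
            return angle - self.max_angle
        return 0.0

def check_pose(constraints, pose):
    """Compare a pose (joint name -> angle) against the constraints
    and collect all violated joints with their magnitudes."""
    return {c.name: c.violation(pose[c.name])
            for c in constraints
            if c.name in pose and c.violation(pose[c.name]) > 0.0}

# Illustrative limits (flexion angles in degrees); the real limits would
# be derived from morphological data and made adjustable by therapists.
constraints = [
    JointConstraint("elbow_flexion", 0.0, 150.0),
    JointConstraint("knee_flexion", 0.0, 140.0),
]
pose = {"elbow_flexion": 160.0, "knee_flexion": 90.0}
print(check_pose(constraints, pose))  # elbow exceeds its limit by 10 degrees
```

Reporting the magnitude of a violation, rather than only a boolean flag, leaves room for the second feature to grade the feedback, e.g., coloring a joint more intensely the further it leaves its allowed range.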
The second feature is a visual representation of the violations on the avatar, such that patients can react to it and adjust their execution of the exercises. This includes a proper description of the information relevant to therapists and patients and of the extent to which this information has to be shown in the interface. Additionally, the joints and their constraints need to be modifiable by the users to allow individualization for patients with differing prerequisites. In general, and for both features, it is important to define software-specific requirements beforehand; this will include a literature review of the theoretical background regarding the use of animation rigs in rehabilitation scenarios, as begun in Related Work, in order to formulate the requirements, their implementation, and their validation. Since two features will be implemented in the course of my project, each of them will be validated individually. The computational system can be validated with a generic motion data set such as the KIT Whole-Body Human Motion Database [6], because it is meant to compute whether different humanoid rigs adhere to realistic morphological restrictions. The user interface can be validated by quantitatively or qualitatively measuring and evaluating its usability with its targeted users: therapists and patients. As an overview, I present the following tasks.
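Validating the computational system against recorded motion data could look roughly like the following offline sketch: replay the recorded joint-angle frames through the limit check and count violations per joint. The frame format (joint name mapped to angle in degrees) and the limit values are assumptions for illustration; real input would come from a motion database and from therapist-adjusted constraints:

```python
# Illustrative limits per joint: (min, max) flexion angle in degrees.
LIMITS = {"elbow_flexion": (0.0, 150.0), "knee_flexion": (0.0, 140.0)}

def count_violations(frames, limits=LIMITS):
    """Return, per joint, how many frames exceed the allowed range.
    Frames are dicts mapping joint names to angles; joints missing
    from a frame are skipped."""
    counts = {name: 0 for name in limits}
    for pose in frames:
        for name, (lo, hi) in limits.items():
            angle = pose.get(name)
            if angle is not None and not (lo <= angle <= hi):
                counts[name] += 1
    return counts

frames = [
    {"elbow_flexion": 10.0, "knee_flexion": 30.0},
    {"elbow_flexion": 155.0, "knee_flexion": 30.0},  # elbow over its limit
    {"elbow_flexion": 120.0, "knee_flexion": -5.0},  # knee under its limit
]
print(count_violations(frames))  # {'elbow_flexion': 1, 'knee_flexion': 1}
```

With realistic, morphologically plausible limits, recorded motions of healthy subjects should produce violation counts near zero, which gives a simple sanity check for the computational system before it is connected to the user interface.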
5. Tasks
Regarding research question (1):
▶ Define requirements of the computational humanoid rig extension for rehabilitation exercise monitoring
▶ Implement a system for existing humanoid rigs used in virtual environments that checks for violations of kinematic constraints
▶ Test the system based on, e.g., the KIT Whole-Body Human Motion Database [6]
Regarding research question (2):
▶ Define requirements of the user interface that monitors violations of the kinematic constraints visually
▶ Implement the user interface
▶ Test the usability of the user interface with potential users
References
[1] A. Asadzadeh, T. Samad-Soltani, Z. Salahzadeh, and P. Rezaei-Hachesu, ‘Effectiveness of virtual reality-based exercise therapy in rehabilitation: A scoping review’, Informatics in Medicine Unlocked, vol. 24, pp. 100562, 2021.
[2] C. Camporesi, M. Kallmann, and J. J. Han, ‘VR solutions for improving physical therapy’, in 2013 IEEE Virtual Reality (VR), 2013, pp. 77–78.
[3] W. Zhao, M. A. Reinthal, D. D. Espy, and X. Luo, ‘Rule-Based Human Motion Tracking for Rehabilitation Exercises: Realtime Assessment, Feedback, and Guidance’, IEEE Access, vol. 5, pp. 21382–21394, 2017.
[4] J. Zhu, Y. Lei, A. Shah, G. Schein, H. Ghaednia, J. Schwab, C. Harteveld, S. Müller, ‘MuscleRehab: Improving Unsupervised Physical Rehabilitation by Monitoring and Visualizing Muscle Engagement’, in Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, Bend, OR, USA, 2022.
[5] Unity Technologies. (2021). Configuring the Avatar. [Online]. Available: https://docs.unity3d.com/Manual/ConfiguringtheAvatar.html. [Accessed: Apr. 17, 2023].
[6] C. Mandery, Ö. Terlemez, M. Do, N. Vahrenkamp, and T. Asfour, ‘The KIT Whole-Body Human Motion Database’, in 2015 International Conference on Advanced Robotics (ICAR), 2015, pp. 329–336.
Contact Persons at the University Würzburg
Andrea Bartl (Primary Contact Person)
Mensch-Computer-Interaktion, Universität Würzburg
andrea.bartl@uni-wuerzburg.de
Marc Erich Latoschik
Mensch-Computer-Interaktion, Universität Würzburg
marc.latoschik@uni-wuerzburg.de