ImBo Training: Immersive Embodiment Training
This project has already been completed.

Overview
The project aims to explore the impact of immersion and avatar embodiment on the user's behaviour, experience, and performance in the context of a sport training / therapy application.

In other words: “If you see yourself in a virtual body that moves as you do, how will this affect your behaviour inside the virtual environment and outside of it, in real life?”

Numerous previous studies have demonstrated that an avatar's appearance considerably influences the user's behaviour during and after a VR application. For instance, users given an attractive avatar display significantly higher confidence and enjoyment during and after the experiment: they approach other avatars much more closely when conversing (by about 3 meters), make more eye contact, and speak in a more dominant tone of voice.
Meanwhile, numerous studies have confirmed that virtual training or rehabilitation considerably increases the user's engagement, enjoyment, progress, and performance. In addition, studies have revealed that motor skills learned during a virtual session do transfer to their real-life equivalents. To a certain extent, the great popularity of fitness video games (such as Wii Fit (22.5 million copies sold), Xbox Fitness (1.5 million workouts in one month), Nike+ Kinect Training, and Your Shape: Fitness Evolved) demonstrates the market's and users' preference for virtual training over its real-life counterpart.
There are also numerous professional VR-based products for rehabilitation (such as Virtual Rehabilitation) or fitness training (such as systrain).

All in all, virtual training is said to provide better support for motivation, feedback, and repetition. However, most of these applications are not immersive and are reportedly inaccurate in terms of movement recognition and feedback (execution explanation and correction). Therefore, in this project we want to combine immersive avatar embodiment, virtual training, and accurate motion tracking. We mainly expect immersive avatar embodiment to improve the user's experience and performance. The main aim of this project is thus to develop and evaluate a prototype in order to investigate the benefits and limitations of such a system.

In order to develop such a VR application, the student(s) will use the Unreal Development Kit (UDK) and the UnrealMe framework. The UnrealMe framework was developed during previous student projects here at the HCI Lab. It is built on top of Unreal Engine 3 (UDK) and acts as a layer connecting motion tracker data to the user's avatar animation and interaction.
First, it replicates the user's gestures and motions onto his/her avatar in real time, using data coming from a motion tracking system. Then, it handles the avatar's interactions with the virtual world by allowing the user to grab, release, or hit objects. The system relies on the NVIDIA PhysX engine to generate accurate physics collision detection and response. The framework is also adaptable and can handle different types of motion tracking systems (Kinect, or any other VRPN-based motion tracking system such as ioTracker, OptiTrack, Vicon, …).
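To illustrate how such a VRPN-based adapter typically works, here is a minimal C++ sketch of a VRPN tracker client. The device name `Tracker0@localhost` and the `applyToAvatarBone` hook are hypothetical placeholders, not part of the UnrealMe framework itself.

```cpp
// Minimal VRPN client sketch (hypothetical; not part of UnrealMe).
// Receives 6-DOF pose updates from any VRPN-based tracker and would
// forward them to the avatar's skeleton.
#include <vrpn_Tracker.h>
#include <cstdio>

// Hypothetical hook: in a framework like UnrealMe, this data would
// drive a skeletal bone of the user's avatar.
static void applyToAvatarBone(int sensor, const double pos[3], const double quat[4])
{
    std::printf("sensor %d: pos(%.2f, %.2f, %.2f)\n",
                sensor, pos[0], pos[1], pos[2]);
}

// VRPN invokes this callback for every tracker report.
static void VRPN_CALLBACK handleTracker(void* /*userData*/, const vrpn_TRACKERCB t)
{
    applyToAvatarBone(t.sensor, t.pos, t.quat);
}

int main()
{
    // Device name follows the usual "device@host" VRPN convention.
    vrpn_Tracker_Remote tracker("Tracker0@localhost");
    tracker.register_change_handler(nullptr, handleTracker);

    // Poll the connection; each call dispatches pending reports.
    while (true)
        tracker.mainloop();
}
```

An adapter along these lines would forward each report to the avatar's skeleton every frame; switching tracking systems then only means changing the device name.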
The student(s) will receive a brief introduction to the framework's features, configuration, and architecture at the beginning of the project. The student(s) will then extend and improve the framework to satisfy the project's tasks and requirements (see below).
In sum, the main aim of the project is to extend the UnrealMe framework in order to evaluate the impact of avatar embodiment on the user's experience, performance, and behaviour in the context of a sport training/therapy application.
Tasks
- UDK & UnrealMe self-training
- Develop a prototype with:
  - [1..n] exercise(s)
  - different types of avatars
  - a feedback system (i.e. correction, assistance, motivation)
  - a performance measurement system (see the sketch after this list)
- Design the evaluation and experiment procedure
- Extend and improve the UnrealMe framework
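As a starting point for the performance measurement and correction feedback mentioned above, the following C++ sketch compares the user's joint angles against a reference exercise pose. The joint set, RMS scoring, and 5-degree tolerance are illustrative assumptions, not project requirements.

```cpp
// Hypothetical sketch of a performance measurement: compare the user's
// joint angles against a reference exercise pose and report the error.
#include <array>
#include <cmath>
#include <cstdio>

// One pose = a set of joint angles in degrees (assumed layout).
using Pose = std::array<double, 3>;  // e.g., {shoulder, elbow, knee}

static const char* kJointNames[] = {"shoulder", "elbow", "knee"};

// Root-mean-square error across joints: a simple overall performance score.
double poseError(const Pose& user, const Pose& reference)
{
    double sum = 0.0;
    for (std::size_t i = 0; i < user.size(); ++i) {
        double d = user[i] - reference[i];
        sum += d * d;
    }
    return std::sqrt(sum / user.size());
}

// Per-joint corrective feedback when a joint deviates beyond a tolerance.
void printCorrections(const Pose& user, const Pose& reference, double toleranceDeg)
{
    for (std::size_t i = 0; i < user.size(); ++i) {
        double d = user[i] - reference[i];
        if (std::fabs(d) > toleranceDeg)
            std::printf("Adjust %s by %.1f degrees\n", kJointNames[i], -d);
    }
}

int main()
{
    Pose reference = {90.0, 45.0, 170.0};  // target exercise pose
    Pose user      = {78.0, 52.0, 169.0};  // measured from the tracker

    std::printf("pose error: %.1f deg (RMS)\n", poseError(user, reference));
    printCorrections(user, reference, 5.0);
}
```

In the actual prototype, the measured pose would come from the motion tracking data each frame, and the corrections would be rendered as in-world feedback (e.g., visual or audio cues) rather than printed.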
Requirements
- UnrealMe framework provided
- Oculus Rift provided
- Oculus-ready UDK provided
- Motion tracking system (Kinect, OptiTrack, ioTracker, Vicon, …) provided
- Interactive Computer Graphics module preferred
- Game Development module preferred
- Good understanding of and experience with object-oriented programming (Java and/or C++)
- Previous experience with UDK and UnrealScript preferred
- Games Programming knowledge and experience preferred
- Knowledge of 3D user interfaces preferred
- Notions of 3D modelling and character animation preferred
- Solid organisational skills and team spirit necessary
- High motivation to develop novel types of VR applications and interfaces
Contact Persons
Dr. Jean-Luc Lugrin & Prof. Dr. Marc Erich Latoschik
Dr. Jean-Luc Lugrin
Human-Computer Interaction
Universität Würzburg
Am Hubland
D-97074 Würzburg
Phone: +49 931 31-81704
E-mail: jean-luc.lugrin@uni-wuerzburg.de
Room: 114, Informatik/Physik Building, Campus Süd