Human-Computer Interaction

Assessment of the user state in VR healthcare applications


This project is already completed.

Background

In this project, you can leverage physiological and/or behavioral data to determine the current state of a user in a virtual environment. Various AI algorithms can be deployed for this purpose. The focus is on capturing the user state in healthcare applications, i.e. mainly therapy and rehabilitation. As stimuli, you can build on applications already available at the chair, e.g. ViTras, Virtual Audiences, ILAST, or VR Gait. Typical user states that could be assessed are stress, anxiety, cognitive workload, or physical exertion.

Since the project targets therapy and rehabilitation, the focus can also be on visualizing sensor data or classification results so that a supervisor of the VR application (e.g. a therapist) can benefit from them.
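As a minimal sketch of one pre-processing step such a visualization might use, the snippet below smooths a noisy heart-rate stream with a moving average before it would be plotted for the therapist. The sample values and window size are illustrative, not taken from any real sensor.

```python
from collections import deque

def smooth(stream, window=5):
    """Moving average over the last `window` samples — the kind of
    smoothing a dashboard might apply before plotting raw sensor data."""
    buf = deque(maxlen=window)
    out = []
    for x in stream:
        buf.append(x)
        out.append(sum(buf) / len(buf))
    return out

# Illustrative heart-rate samples (bpm) with two noisy spikes.
hr = [72, 74, 90, 73, 71, 70, 95, 72]
print(smooth(hr, window=3))
```

A larger window suppresses momentary spikes more strongly but delays the displayed trend, a trade-off the supervisor view would have to balance.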

A wide variety of data types can be considered here, for example heart rate, electroencephalography (EEG), accelerometer data, eye tracking, skin conductance, or motion data. Machine learning or other classification algorithms can be used to uncover relations between physiological data patterns and the user's state, which depends on the experience in the virtual environment.
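To make the classification idea concrete, here is a minimal, self-contained sketch of a nearest-centroid classifier that maps two illustrative physiological features (mean heart rate and mean skin conductance) to a user state. All sensor values, labels, and thresholds are invented for illustration; a real study would use recorded data, proper feature normalization, and a validated model.

```python
from statistics import mean

# Illustrative training samples: (mean heart rate in bpm,
# mean skin conductance in microsiemens) per labeled state.
TRAINING = {
    "calm":     [(62, 2.1), (68, 2.4), (71, 2.0)],
    "stressed": [(95, 6.8), (102, 7.5), (88, 6.1)],
}

def centroid(samples):
    # Per-class mean of each feature.
    return (mean(s[0] for s in samples), mean(s[1] for s in samples))

CENTROIDS = {label: centroid(s) for label, s in TRAINING.items()}

def classify(sample):
    # Nearest-centroid rule: assign the label whose class centroid is
    # closest in squared Euclidean distance. (In practice the features
    # would be normalized first so heart rate does not dominate.)
    def dist2(c):
        return (sample[0] - c[0]) ** 2 + (sample[1] - c[1]) ** 2
    return min(CENTROIDS, key=lambda label: dist2(CENTROIDS[label]))

print(classify((66, 2.2)))   # a sample near the "calm" centroid
print(classify((98, 7.0)))   # a sample near the "stressed" centroid
```

The same pattern extends to more features (EEG bands, motion energy) and to richer models; the nearest-centroid rule merely illustrates how labeled physiological windows can be turned into a state estimate.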

You are welcome to approach us with your own ideas for this project, but we can also suggest concrete topics.

If you would like to read more about the use of physiological data in therapy and rehabilitation applications, have a look at this paper.

VIA VR

The work would be embedded in the VIA VR project. Here is a trailer that depicts some of the project's goals.

Tasks

Prerequisites / Favorable


Contact Persons at the University Würzburg

Murat Yalcin (Primary Contact Person)
Human-Computer Interaction, Universität Würzburg
murat.yalcin@uni-wuerzburg.de

Andreas Halbig
Human-Computer Interaction, Universität Würzburg
andreas.halbig@uni-wuerzburg.de
