Human-Computer Interaction

Open topics



Augmented Reality vs. Virtual Reality Mirrors and Gaze Behavior: Exploring Body Image Perception ★
Investigating How Motion Velocity Impacts Virtual Body Weight Perception in VR ★
Hands-On Virtual Body Weight - Hand-Controlled Virtual Body Weight Manipulation in VR ★
Usability Evaluation of Enrollment System ★
The goal of this project is to perform a usability evaluation of the university's enrollment system to improve its overall quality.
Integrating human motion generation based on sparse tracking into Unity for full-body movement ★
Integrating state-of-the-art, machine-learning-based prediction of human motion from sparse tracking data; a minimal sketch of the idea follows below.
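As one way to picture the integration only: the sketch below assumes a pretrained model is exposed as a Python callable and that Unity listens for pose updates on a local UDP port. The names predict_full_body and PORT, the joint count, and the message format are hypothetical placeholders, not project code.

    import json
    import socket
    import numpy as np

    PORT = 9000  # hypothetical port a Unity-side UDP listener would bind to

    def predict_full_body(sparse_pose: np.ndarray) -> np.ndarray:
        """Stand-in for the trained model: maps three tracked 6-DoF poses
        (head, left hand, right hand) to 24 full-body joint rotations."""
        rng = np.random.default_rng(0)
        W = rng.standard_normal((24 * 4, sparse_pose.size))  # placeholder weights
        return (W @ sparse_pose.ravel()).reshape(24, 4)      # one quaternion per joint

    def stream_pose(sparse_pose: np.ndarray, sock: socket.socket) -> None:
        # Predict the full-body pose and send it to Unity as one JSON datagram.
        body = predict_full_body(sparse_pose)
        msg = json.dumps({"joints": body.tolist()}).encode("utf-8")
        sock.sendto(msg, ("127.0.0.1", PORT))

    if __name__ == "__main__":
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        frame = np.zeros((3, 7))  # dummy frame: xyz position + wxyz quaternion per tracker
        stream_pose(frame, sock)

On the Unity side, a small C# script would read these datagrams and apply the joint rotations to the avatar rig; that part is omitted here.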
Out-of-Body Experience In XR ★
Developing and evaluating an XR application for inducing out-of-body experiences.
Analysis of metadata and machine learning with the currently largest dataset (16 TB) ★
Extraction and analysis of metadata from the currently largest dataset (16 TB) for training machine learning models for person identification in VR.
Evi-Grasp VR ★
Developing and evaluating a VR application for Water Bottle Design. The application should provide a way to quickly test the graspability of water bottles with virtual hands in VR.
Developing and Evaluating XR for Show Lighting System
Network Management meets Metaverse
Evaluate the current state of research on the Metaverse and the related challenges for the telecommunications industry, then develop and verify a virtual world that takes existing visualization approaches and use cases into account.
Percept AI
In the Percept AI project, you will learn about human perception of artificial intelligence. You will build a virtual reality application to evaluate how different appearances of an AI affect human perception.
Virtual Reality Games
This xtAI Lab introduces tools and workflows for designing and developing virtual environments, using computer games as an example. Working in groups, students conceptualize, plan, design, build, evaluate, and refine a comprehensive VR application prototype.
Motion Data Sensor Fusion
In the MoSeF project, we aim to improve the spatial and temporal performance of 3D motion capture for VR and AR applications by fusing the data of optical, inertial, and magnetic sensors; a minimal sketch of the fusion idea follows below.
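Purely as an illustration of the fusion idea, the sketch below blends a high-rate but drifting inertial estimate with low-rate, drift-free optical position samples using a complementary filter. The blend factor, sample rates, and sensor interfaces are assumptions, not MoSeF internals.

    import numpy as np

    ALPHA = 0.98  # assumed weight of the inertial (dead-reckoning) estimate

    def fuse_step(pos, vel, accel, dt, optical_pos=None):
        """Integrate the inertial acceleration and, when an optical sample
        is available, pull the position estimate back toward it."""
        vel = vel + accel * dt
        pos = pos + vel * dt
        if optical_pos is not None:
            pos = ALPHA * pos + (1.0 - ALPHA) * optical_pos
        return pos, vel

    if __name__ == "__main__":
        pos, vel = np.zeros(3), np.zeros(3)
        for step in range(1000):
            accel = np.array([0.0, 0.0, 0.1])                   # fake IMU sample at 1 kHz
            optical = np.zeros(3) if step % 10 == 0 else None   # optical sample at 100 Hz
            pos, vel = fuse_step(pos, vel, accel, dt=1e-3, optical_pos=optical)
        print(pos)

In practice, a Kalman-style filter over position, orientation, and sensor biases would typically replace the fixed blend factor used here.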
The Effects of Familiarity of an Avatar
Does familiarity with an avatar influence affective or emotional appraisal? How can familiar avatars be implemented in practice to investigate this research question?
Embodiment and Stroop Interference
Understanding the effects of hand proximity to objects and tasks is critical for interaction with hand-held and near-hand objects. Even though self-avatars have been shown to be beneficial for various tasks in virtual environments, little research has investigated the effect of avatar hand proximity on working memory.
Super-Hero Revolution
Super-Hero Revolution is an embodied VR experience. Users observe their avatar in a virtual mirror while it undergoes a series of transformations rendered with visual transition effects; the transformations are meant to act as a suggestion, evoking emotional release and mindfulness within a few seconds.
Virtual Control Room for Real Satellites
This project aims to create a virtual control room for real satellites. The satellites are not simulated: we will use real telemetry data from satellites in orbit.
Virtual Platelet Platform (VIPP)
The goal of this project is to design, implement and evaluate an immersive interface to interact with the repository data.
3DUI Development in Cooperation with RealVis
The goal of this project is to develop a solution that enables interactive viewing of products and context-related display of information. The viewer should be able to explore the product via intuitive inputs and be presented with visually appealing information about the product in 3D space.
HATI: Haptic Angiography Training Interface
Optimization and evaluation of a training hardware interface for minimally invasive surgeries. The interface measures guide-wire movements applied during angiography sessions: a small optical sensor, similar to a mouse sensor, captures the wire movements at the locks sealing the access port (a sketch of the measurement principle follows below). The prototype has to be optimized and adjusted to real devices, and its performance has to be evaluated.
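As a rough illustration of the measurement principle, the sketch below converts incremental optical-sensor counts into wire translation and rotation. The sensor resolution and the wire diameter are hypothetical values, not those of the actual prototype.

    import math

    COUNTS_PER_MM = 400.0   # assumed sensor resolution along the wire axis
    WIRE_DIAMETER_MM = 0.9  # hypothetical guide-wire diameter

    def counts_to_motion(dx_counts: int, dy_counts: int):
        """Map incremental sensor counts to wire translation (mm) and rotation (deg),
        assuming x counts track axial sliding and y counts track surface travel
        caused by the wire spinning under the sensor."""
        translation_mm = dx_counts / COUNTS_PER_MM
        surface_mm = dy_counts / COUNTS_PER_MM
        rotation_deg = surface_mm / (math.pi * WIRE_DIAMETER_MM) * 360.0
        return translation_mm, rotation_deg

    print(counts_to_motion(200, 50))  # one example frame of raw counts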