Human-Computer Interaction

INTO3D - Immersive Programming


This project is already completed.

Background

Visual programming is a powerful approach to creating and maintaining complex software systems. The flow of data is defined by connecting processing nodes that provide various functionalities. INTO3D (interaction-oriented 3D modelling & simulation) takes visual programming to the next level [3]. It projects the programming logic right into 3D spatial models and thereby amalgamates the modelling and the simulation space. Blending these two worlds is an important aspect of understanding the dynamics of complex systems, as it immediately reflects how the components of a system interact and how global system properties emerge. With the rise of VR systems on the mass market, the immersive perspective offered by INTO3D becomes a key enabler for programming complex 3D simulations and games. It adds a behavioural component to the merely geometric VR editors released by major engine developers [1, 2].

Tasks

This project comprises the development and evaluation of an immersive VR user interface for the existing Unity3D implementation of INTO3D. Concrete interaction techniques need to be designed and refined for (1) placing, nesting and configuring INTO3D operators, (2) setting up and removing links between their connectors, (3) navigating the scene, and (4) controlling the simulation. In order to evaluate the immersive VR interface of INTO3D, usability tests have to be conducted that compare development performance with traditional visual programming interfaces such as the Graph Development Interface for Unity3D. The complexity of the models built should be increased stepwise to shed light on various aspects of the structural artefacts of the resulting programming code, including hierarchies and caller-callee relationships.
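Independently of the VR front end, the state that these four interaction tasks manipulate can be thought of as an operator graph: operators placed in the scene, nested hierarchically, wired together via connectors, and driven by a simulation loop. The following is a minimal sketch of such a data model in Python; all class, attribute, and method names are illustrative assumptions for this document only, not the actual INTO3D API (the real implementation lives in Unity3D/C#).

```python
# Hypothetical sketch of an operator-graph model mirroring the four
# interaction tasks: (1) placement/nesting/configuration of operators,
# (2) linking connectors, (4) simulation control. Names are assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Connector:
    name: str
    owner: Optional["Operator"] = None

@dataclass
class Operator:
    name: str
    position: tuple = (0.0, 0.0, 0.0)          # placement in the 3D scene (task 1)
    parent: Optional["Operator"] = None          # nesting (task 1)
    children: list = field(default_factory=list)
    inputs: dict = field(default_factory=dict)
    outputs: dict = field(default_factory=dict)

    def add_input(self, name: str) -> Connector:
        c = Connector(name, self)
        self.inputs[name] = c
        return c

    def add_output(self, name: str) -> Connector:
        c = Connector(name, self)
        self.outputs[name] = c
        return c

    def nest(self, child: "Operator") -> None:
        child.parent = self
        self.children.append(child)

class Graph:
    def __init__(self):
        self.operators: list = []
        self.links: list = []     # (output_connector, input_connector) pairs (task 2)
        self.running = False      # simulation control flag (task 4)

    def add(self, op: Operator) -> Operator:
        self.operators.append(op)
        return op

    def link(self, out_c: Connector, in_c: Connector) -> None:
        self.links.append((out_c, in_c))

    def unlink(self, out_c: Connector, in_c: Connector) -> None:
        self.links.remove((out_c, in_c))

# Example: one operator nested inside another, with a single link.
g = Graph()
boid = g.add(Operator("Boid"))
rule = g.add(Operator("Alignment"))
boid.nest(rule)
g.link(rule.add_output("velocity"), boid.add_input("steering"))
```

A VR interaction technique for task (2), for instance, would translate a controller gesture from one connector to another into a single `link` call on such a model, which keeps the evaluation comparable across the immersive and the traditional desktop interface.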

Prerequisites

A background in computer graphics, VR, and Unity3D programming is a great asset for this work.

References

[1] Epic Games, Inc. Unreal Engine VR Editor. https://docs.unrealengine.com/latest/INT/Engine/Editor/VR/, December 2016.

[2] David Poirier-Quinot, Damien Touraine, and Brian Katz. BlenderCave: A multimodal scene graph editor for virtual reality. In International Conference on Auditory Display (ICAD), pages 223–230, 2013.

[3] Sebastian von Mammen, Stefan Schellmoser, Christian Jacob, and Jörg Hähner. Modelling & understanding the human body with Swarmscript. In The Digital Patient: Advancing Medical Research, Education, and Practice, chapter 11, pages 149–170. Wiley Series in Modeling and Simulation. John Wiley & Sons, Hoboken, New Jersey, 2016.


Contact Persons at the University of Würzburg

Sebastian von Mammen (Primary Contact Person)
Mensch-Computer-Interaktion, Universität Würzburg
sebastian.von.mammen@uni-wuerzburg.de
