Human-Computer Interaction

INTO3D - Smart Environments


This project is already completed.

Background

Visual programming is a powerful approach to creating and maintaining complex software systems. The flow of data is defined by connecting processing nodes that provide various functionalities. INTO3D (interaction-oriented 3D modelling & simulation) takes visual programming to the next level [5]. It projects the programming logic right into 3D spatial models and thereby amalgamates the modelling and the simulation space. Blending these two worlds is an important aspect for understanding the dynamics of complex systems, as it immediately reflects how the components of a system interact and how global system properties emerge. With the rise of VR systems on the mass market, the immersive perspective offered by INTO3D becomes a key enabler for programming complex 3D simulations and games. It adds a behavioural component to the merely geometric VR editors released by major engine developers [2, 4].
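To make the dataflow idea concrete, the following is a minimal, hypothetical sketch of such a node graph (in Python for brevity; it is not INTO3D's actual Unity3D implementation). Data is pulled through the graph by evaluating a node, which in turn evaluates the upstream nodes connected to its inputs.

```python
class Node:
    """A processing node; its inputs are other nodes whose outputs it consumes."""
    def __init__(self, name, fn, *inputs):
        self.name, self.fn, self.inputs = name, fn, inputs

    def evaluate(self):
        # Pull-based evaluation: resolve all upstream nodes first,
        # then apply this node's function to their outputs.
        return self.fn(*(n.evaluate() for n in self.inputs))

# A tiny graph: two (hypothetical) sensor sources feed an averaging node.
sensor_a = Node("sensor_a", lambda: 20.0)
sensor_b = Node("sensor_b", lambda: 22.0)
average = Node("average", lambda a, b: (a + b) / 2.0, sensor_a, sensor_b)

print(average.evaluate())  # 21.0
```

In INTO3D, the same connections are drawn directly onto the 3D objects of the scene rather than on a separate 2D canvas.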

Tasks

This project focusses on adapting the existing Unity3D implementation of INTO3D for smart environment domains such as smart homes [3] or smart cities [1]. In particular, a comprehensive model scenario has to be created in 3D and made accessible in VR. Interaction dynamics at the component level (e.g. the behaviour of the fridge) and at the system level (e.g. the energy consumption of the house) need to be visualised, and the user needs to be empowered to intervene by re-programming individual system components or sets thereof. Overall, the following steps will ensure a fruitful project: (1) Research on smart environments, (2) study of the existing code base, (3) comprehensive sketch of a smart environment model, (4) crafting of a 3D model scenario, (5) programming of required component interaction primitives, (6) realisation of the comprehensive immersive model scenario, (7) VR case studies, (8) presentation and demo of the achieved results.
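The relationship between component-level behaviour and system-level properties can be sketched as follows (a hypothetical Python illustration, not the project's Unity3D code; the fridge and lamp wattages are made-up figures). Re-assigning a component's behaviour at runtime mirrors what the VR user would do by re-programming it in the scene:

```python
class Component:
    """A smart-home component with a re-programmable behaviour."""
    def __init__(self, name, behaviour):
        self.name = name
        self.behaviour = behaviour  # callable returning power draw in watts

    def power(self):
        return self.behaviour()

def fridge_behaviour():
    # Component level: the fridge's compressor draws a fixed load here.
    return 120.0

house = [Component("fridge", fridge_behaviour),
         Component("lamp", lambda: 60.0)]

def total_consumption(components):
    # System level: the house's energy use emerges from component behaviours.
    return sum(c.power() for c in components)

print(total_consumption(house))  # 180.0

# Re-programming a single component changes the global system property:
house[0].behaviour = lambda: 80.0  # hypothetical eco mode for the fridge
print(total_consumption(house))  # 140.0
```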

Prerequisites

A background in computer graphics, VR, and Unity3D programming is a great asset for this work.

References

[1] Michael Batty, Kay W Axhausen, Fosca Giannotti, Alexei Pozdnoukhov, Armando Bazzani, Monica Wachowicz, Georgios Ouzounis, and Yuval Portugali. Smart cities of the future. The European Physical Journal Special Topics, 214(1):481–518, 2012.

[2] Epic Games, Inc. Unreal engine vr editor. https://docs.unrealengine.com/latest/INT/Engine/Editor/VR/, December 2016.

[3] Sang Hyun Park, So Hee Won, Jong Bong Lee, and Sung Woo Kim. Smart home–digitally engineered domestic life. Personal and Ubiquitous Computing, 7(3-4):189–196, 2003.

[4] David Poirier-Quinot, Damien Touraine, and Brian Katz. Blendercave: A multimodal scene graph editor for virtual reality. In International Conference on Auditory Display (ICAD), pages 223–230, 2013.

[5] Sebastian von Mammen, Stefan Schellmoser, Christian Jacob, and Jörg Hähner. The Digital Patient: Advancing Medical Research, Education, and Practice, chapter 11. Modelling & Understanding the Human Body with Swarmscript, pages 149–170. Wiley Series in Modeling and Simulation. John Wiley & Sons, Hoboken, New Jersey, 2016.


Contact Persons at the University Würzburg

Sebastian von Mammen (Primary Contact Person)
Mensch-Computer-Interaktion, Universität Würzburg
sebastian.von.mammen@uni-wuerzburg.de
