Human-Computer Interaction

Bi-directional communication between real-time simulation engines using Simulator X and Unreal Engine 4


This project has already been completed.

Motivation

Different real-time simulation engines have different strengths and weaknesses. To combine their strengths while minimizing the impact of their respective weaknesses, it is desirable for simulation engines to communicate and cooperate with each other.

Initially, this research topic was motivated by Simulator X’s need for a state-of-the-art renderer. Unreal Engine 4 was therefore chosen as the counterpart to combine with Simulator X, so that Simulator X content can be rendered by Unreal’s high-quality renderer. However, this research topic is not driven by this idea alone but follows a more general approach: it is also desirable to enrich Unreal Engine 4 content with Simulator X’s functionality (e.g. multimodal input processing).

Earlier approaches using the Unreal Engine as a foundation for immersive virtual reality (VR) applications have been made in the Human-Computer Interaction group. For example, Lugrin and colleagues used Unreal Engine 3 to create a powerful VR middleware for Cave Automatic Virtual Environments (Lugrin, Charles, Cavazza et al. 2012) and to research body ownership in VR (Lugrin, Latt & Latoschik 2015).

Generally speaking, game engines are often used for implementing CAVE applications. Besides the Unreal Engine, both Unity (Jung, Krohn & Schmidt 2010) and the CryEngine (Juarez, Schonenberg & Bartneck 2010) have been used for such projects. Another example of using Unity to create VR applications is the geographic information system implemented by Wang and colleagues (Wang, Mao, Zeng et al. 2010). A holistic analysis of the advantages and disadvantages of different 3D game engines in the context of VR was conducted by Trenholme and Smith (Trenholme & Smith 2008); among others, it examines the CryEngine and an earlier version of the Unreal Engine.

Task

The starting point for this research is Simulator X’s Unity component, which already provides basic functionality to render Simulator X content using the Unity game engine. The knowledge gained through the implementation of this component shall be used to create a similar component for Unreal rendering.

Technically speaking, this means that there needs to be a basic protocol the engines can use to communicate with each other (e.g. based on JSON). Furthermore, both engines need to “speak the same language” in terms of events, so that each is always aware of what its counterpart is doing. This “common language” should be based on an ontology, since Simulator X already provides advanced ontology creation mechanisms. As soon as this requirement is met, it should be possible to render basic Simulator X content using Unreal.
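The concrete message format is an open design choice. As a minimal sketch, an ontology-grounded event could be serialized as a JSON object like the one below; all field names and the event type are hypothetical illustrations, not part of Simulator X’s or Unreal’s actual interfaces.

```python
import json

# Hypothetical event message: an entity transform update sent from
# Simulator X to Unreal Engine 4. The "type" field would reference a
# concept from the shared ontology so both engines interpret it alike.
event = {
    "type": "EntityTransformUpdate",   # assumed ontology concept name
    "entityId": "door-01",
    "position": [1.0, 2.0, 0.5],
    "rotation": [0.0, 0.0, 0.0, 1.0],  # quaternion (x, y, z, w)
}

# Serialize for transport (e.g. over a socket between the engines) ...
wire = json.dumps(event)

# ... and parse on the receiving side.
received = json.loads(wire)
assert received == event
print(received["type"])  # → EntityTransformUpdate
```

Keeping the event type a single ontology-derived string would let either engine dispatch incoming messages without hard-coding the counterpart’s internals.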

A major goal is to make Unreal Blueprints (Unreal’s visual scripting language) accessible to Simulator X, so that future users can easily use the connection between the two engines. To create this kind of integration, it is necessary to implement C++ logic in Unreal that provides the nodes for the visual scripting language and sends requests to and receives responses from Simulator X. One major area of Simulator X’s use is multimodal interaction; an example of this kind of integration is therefore the use of Simulator X to predict gestures. An Unreal Engine programmer or game designer might, for instance, send tracking data from their application to Simulator X via a blueprint node and receive a prediction about which gesture the user performed.
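Such a request/response exchange could follow the same JSON convention. The sketch below models the round trip in Python with a stand-in predictor; the message fields and the trivial classification rule are hypothetical and only illustrate the message flow, not Simulator X’s actual gesture recognition.

```python
import json

def build_gesture_request(tracking_samples):
    """Build a hypothetical 'PredictGesture' request carrying tracked
    hand positions, as a Blueprint node might send to Simulator X."""
    return json.dumps({
        "type": "PredictGesture",    # assumed ontology concept name
        "samples": tracking_samples, # list of [x, y, z] positions
    })

def handle_request(wire):
    """Stand-in for the Simulator X side: parse the request and answer
    with a gesture prediction. The rule here (net upward motion means
    'swipe-up') is a placeholder for the real multimodal processing."""
    request = json.loads(wire)
    samples = request["samples"]
    gesture = "swipe-up" if samples[-1][1] > samples[0][1] else "unknown"
    return json.dumps({"type": "GesturePrediction", "gesture": gesture})

# Example round trip: an upward hand movement.
reply = json.loads(handle_request(build_gesture_request(
    [[0.0, 0.0, 0.5], [0.0, 0.1, 0.5], [0.0, 0.3, 0.5]]
)))
print(reply["gesture"])  # → swipe-up
```

On the Unreal side, the request-building half of this exchange would live in a C++ function exposed as a Blueprint node, so designers can trigger it without touching the networking code.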

As a proof of concept, it is desired to set up one or more user experiments that use the aforementioned engine connection. An example of such a study would be a comparison between an existing Simulator X application (i.e. SiXtons Curse) rendered with the existing renderer and the same application rendered by Unreal Engine 4.

Literature

Contact

Dennis Wiebusch | Phone: 0931 31 886313 | E-Mail: dennis.wiebusch@uni-wuerzburg.de


Contact Persons at the University Würzburg

Dennis Wiebusch (Primary Contact Person)
Mensch-Computer-Interaktion, Universität Würzburg
dennis.wiebusch@uni-wuerzburg.de
