Motion Data Sensor Fusion
This call for a thesis or project is open for the following modules:
If you are interested, please get in touch with the primary contact person listed below.
Background
Relying on a single sensor type for capturing data makes an application prone to the intrinsic noise and drift of that sensor. Smartphones overcome this for route guidance by fusing GPS positions with measurable physical quantities such as acceleration and rotation rate to improve the location estimate. Two questions have to be answered: first, how does one decide how much confidence to place in each sensor's data with respect to its spatial and temporal performance, and second, how can the data streams be combined to improve the overall prediction?
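To illustrate the second question, the following is a minimal sketch in Python (all names and numbers are illustrative, not part of the project): if each sensor's confidence is expressed as a variance, inverse-variance weighting combines two position estimates so that the more trusted sensor dominates, which is the static special case of a Kalman update.

```python
import numpy as np

def fuse_estimates(x_a, var_a, x_b, var_b):
    """Fuse two noisy estimates of the same quantity by inverse-variance
    weighting: the sensor with the lower variance gets the larger weight,
    and the fused variance is never worse than either input."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    x_fused = (w_a * x_a + w_b * x_b) / (w_a + w_b)
    var_fused = 1.0 / (w_a + w_b)
    return x_fused, var_fused

# Hypothetical example: an absolute but noisier position (e.g., optical
# tracking) fused with a smoother estimate (e.g., headset tracking).
pos_optical, var_optical = np.array([1.02, 0.48, 1.75]), 0.04
pos_headset, var_headset = np.array([0.95, 0.52, 1.80]), 0.01

fused, var = fuse_estimates(pos_optical, var_optical, pos_headset, var_headset)
print(fused, var)  # weighted toward the lower-variance headset estimate
```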
In this project, we aim to replicate the behavior from the example above for motion capture and its applications. This is beneficial wherever spatial drift, offset, and latency cause problems, as in VR and AR applications. As a first step, MoSeF will fuse data from an optical motion capture system with positional data from a virtual reality headset by registering the two data streams against each other. It will not stop there: in a later stage, it will evolve into a generalized engine that filters multiple data streams at once and provides the adjusted data to the application above it. The goal is to refine physical data where necessary by adding sensors to the setup and piping their data streams into the fusion algorithms.
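As a sketch of the spatial part of such a registration, the snippet below estimates the rigid transform between the two coordinate frames with the Kabsch method, assuming the streams have already been temporally aligned and resampled; the data and variable names are hypothetical and not tied to any specific system API.

```python
import numpy as np

def register_rigid(src, dst):
    """Estimate the rigid transform (R, t) that maps points `src` onto `dst`
    via the Kabsch/Umeyama method (SVD of the cross-covariance matrix).
    `src` and `dst` are (N, 3) arrays of temporally aligned samples of the
    same tracked point, one array per tracking system."""
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Hypothetical usage: map headset positions into the mocap coordinate frame.
rng = np.random.default_rng(0)
mocap = rng.normal(size=(100, 3))
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
headset = mocap @ true_R.T + np.array([0.5, 0.0, 1.2])

R, t = register_rigid(headset, mocap)              # headset -> mocap frame
aligned = headset @ R.T + t
print(np.abs(aligned - mocap).max())               # ~0 up to numerical error
```

The remaining latency between the streams could then be handled separately, for example by estimating the time offset that best aligns the two position trajectories before applying the registration.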
Tasks
The project will focus on the following tasks:
- Recording of data for evaluation
- Literature research on sensor fusion
- Design and implementation of the fusion algorithms
- Comparative study between the different solutions
- Evaluation and presentation of results
Prerequisites
- Introduction to Programming (Java, C++ or C#), e.g., 10-MCS-EinP
- Introduction to MCS, e.g., 06-MCS-GL-AP
- Statistics 1 and 2, e.g., 06-PSY-STAT-1 and 06-PSY-STAT-2
- Advanced Programming, e.g., 10-MCS-GADS, 10-MCS-EPP and 10-MCS-SPSE
- Interactive Computer Graphics, e.g., 10-MCS-ICGV
- Software Quality, e.g., 10-MCS-SQ and 10-MCS-ST
Optional
- Game Engine Experience (Unity)
- OptiTrack Motion Capture System Experience
Contact Persons at the University of Würzburg
Marc Erich Latoschik
Mensch-Computer-Interaktion, Universität Würzburg
marc.latoschik@uni-wuerzburg.de
Sebastian Oberdörfer
Mensch-Computer-Interaktion, Universität Würzburg
sebastian.oberdoerfer@uni-wuerzburg.de
Matthias Popp (Primary Contact Person)
Mensch-Computer-Interaktion, Universität Würzburg
matthias.popp@uni-wuerzburg.de