Immersive Teleoperation & Data Collection for Humanoid Robotics
If you are interested, please get in touch with the primary contact person listed below.
Motivation
Enter the world of humanoid robotics with jarm.ai GmbH. Humanoid robotics is close to moving from demonstrations to repeatable real-world deployments.
One key bottleneck is teleoperation: humans must be able to control robots reliably, both to enable high-quality remote interaction and to efficiently collect training data.
jarm.ai develops humanoid robots (currently upper-body devkits) and a software platform (jarmOS) that serves as a base layer for rapid prototyping and partner use cases.
As part of these theses, VR/AR-based teleoperation interfaces and a reproducible data collection workflow will be developed and evaluated. This serves as a foundation for future HCI lab studies and real robot control.
Details
- Start: flexible, but coordinated if Topics A and B are both filled
- Duration: thesis duration according to the applicable study regulations (bachelor or master)
- Location: University of Würzburg (lab/studies) + coordination with jarm.ai (remote, biweekly)
Requirements
- Interest in XR (VR/AR), HCI methodology, and/or robotics-related interaction
- Solid programming skills (e.g., Unity / VR experience or comparable background)
- Motivation for prototypical system development + structured evaluation
Topic A: Immersive Teleoperation Interface for Controlling Humanoids (VR/AR)
Goal:
Design and evaluate intuitive teleoperation interfaces (e.g., point-to-move, direct hand control, freehand mode, …) and compare fully immersive VR approaches with passthrough AR.
Outcome:
A validated, flexible teleoperation UI suitable for future HCI lab studies and, prospectively, for real robot control.
Tasks: [Example scope, final definition upon alignment]
- Literature and system review (teleoperation, immersion, embodiment, assistance functions)
- Prototyping of interaction modes (VR and/or passthrough AR)
- Definition of measurable criteria (e.g., task time, errors, NASA-TLX, precision, subjective control/trust)
- User study (lab-based, Wizard-of-Oz) with suitable comparison conditions
- Documentation + recommendations (“design guidelines”) for teleoperation (general or for the selected target group)
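To make one of the listed criteria concrete: the raw ("unweighted") NASA-TLX score is commonly computed as the mean of the six subscale ratings on a 0–100 scale. A minimal sketch, assuming ratings are collected per trial; the function name is ours, not part of any existing toolkit:

```python
# Illustrative sketch only: raw (unweighted) NASA-TLX scoring.
# The six subscales are each rated on a 0-100 scale; the raw TLX
# score is their arithmetic mean.
def raw_tlx(mental, physical, temporal, performance, effort, frustration):
    """Return the raw TLX score: mean of the six subscale ratings (0-100)."""
    scales = [mental, physical, temporal, performance, effort, frustration]
    if not all(0 <= s <= 100 for s in scales):
        raise ValueError("subscale ratings must be in [0, 100]")
    return sum(scales) / len(scales)

# Example: one participant's ratings after a teleoperation trial
score = raw_tlx(55, 20, 40, 30, 50, 25)
print(score)
```

Per-trial scores like this can then be compared across the study's conditions (e.g., VR vs. passthrough AR) alongside task time and error counts.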
Topic B: End-to-End UX & Toolkit for Robot Data Collection (AR/VR)
Goal:
Develop a passthrough AR/VR interface that logs interaction data (e.g., trajectories, gaze data, events), plus evaluate the workflow for researchers (setup → logging → export).
Outcome:
A reusable, optimized data pipeline + UX guidelines that encourage labs to conduct experiments with jarm.ai hardware.
Tasks: [Example scope, final definition upon alignment]
- Define a “minimal viable logging” concept: Which data is required, which data is optional?
- UX design for the research workflow (setup/calibration → recording → annotation → export)
- Implementation of a logging/export format (structured sessions/episodes, metadata, events)
- Evaluation of the workflow with researchers/students (usability, friction points, reproducibility)
- Result documentation + “how-to” for repeatable data collection
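One possible shape for the session/episode logging format described above, as a minimal sketch only: the class names (`Session`, `Episode`, `LogEvent`) and fields are illustrative assumptions, not part of jarmOS.

```python
# Hypothetical "minimal viable logging" sketch: sessions contain episodes,
# episodes contain timestamped events (trajectories, gaze samples, UI events).
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass
class LogEvent:
    t: float        # timestamp in seconds (monotonic clock if not supplied)
    kind: str       # e.g. "pose", "gaze", "ui_event"
    data: dict      # payload, e.g. a single hand-trajectory or gaze sample

@dataclass
class Episode:
    episode_id: str
    events: list = field(default_factory=list)

    def log(self, kind, data, t=None):
        self.events.append(
            LogEvent(t if t is not None else time.monotonic(), kind, data)
        )

@dataclass
class Session:
    session_id: str
    metadata: dict  # participant, condition, hardware, calibration info
    episodes: list = field(default_factory=list)

    def export_json(self):
        # asdict recurses through nested dataclasses, giving a plain
        # JSON-serializable structure for export/annotation tooling
        return json.dumps(asdict(self), indent=2)

# Example: one session with a single gaze event
session = Session("S01", {"condition": "passthrough_ar", "participant": "P01"})
ep = Episode("E01")
ep.log("gaze", {"x": 0.12, "y": -0.30}, t=0.016)
session.episodes.append(ep)
print(session.export_json())
```

A flat, self-describing structure like this keeps the "required vs. optional data" question explicit (required fields live in the dataclasses, optional data goes into the payload dicts) and makes the export step trivially reproducible.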
Collaboration
The topics are scoped so that two students can work in parallel. Both students would have clearly
separated focuses and a shared integration goal:
• Student 1: Teleoperation interaction & UI (VR/AR) + user study
• Student 2: Data collection toolkit + logging/export + workflow evaluation
Both students work on a shared foundation (shared repository/assets), support each other, and
deliver compatible results (tele-op ↔ logging).
Contact Persons
Prof. Dr. Marc Erich Latoschik (Primary Contact Person)
Human-Computer Interaction, Universität Würzburg
marc.latoschik@uni-wuerzburg.de
Jacob Munke (Primary Contact Person)
jarm.ai
jacobmunke@jarm.ai
Jeshwitha Raja
jarm.ai
jeshma@jarm.ai