Human-Computer Interaction

Augmenting Professional Light Show Design



Thesis

1 Motivation and Objectives

Professional stage lighting workflows rely on speed, precision, and well-trained motor skills developed through years of using physical lighting consoles (Branton & Huizinga, 2025; Cadena, 2017). While immersive environments enable spatial inspection and contextual evaluation of lighting designs, controller-based VR interaction can reduce efficiency and disrupt established professional routines (Schwarz, 2023; Simpson, 2021).

Preliminary feedback from professional users indicates a strong preference for continuing to use a haptic lighting console in order to preserve interaction speed and established working habits. At the same time, users value the spatial advantages of immersive visualization, such as evaluating lighting setups from multiple viewpoints within a virtual stage context.

The objective of this master thesis is therefore to design an Extended Reality (XR) workflow that preserves the speed and familiarity of physical console interaction while retaining the spatial benefits of immersive environments. Concretely, the work focuses on three central aspects of XR support for professional lighting design: (1) navigation within the virtual stage space, (2) manipulation of lighting parameters through established workflows, and (3) visualization of the resulting lighting states in an immersive context. To achieve this, the thesis explores a Mixed Reality (MR) passthrough approach that allows users to operate their real lighting console while simultaneously perceiving and interacting with a virtual stage scene.

2 State of the Art and Preliminary Work

2.1 Preliminary Work

This master thesis builds on a previously developed VR-based prototype for professional stage lighting control, created in the context of an HCI project. The prototype enabled DMX-compatible control of multiple moving head fixtures within a virtual stage environment and focused on core lighting parameters such as pan and tilt, color, and dimmer. Interaction was realized through controller-based techniques, including hybrid cone selection, spatial point-to-target orientation, and world-anchored parameter controls.
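The parameter model underlying the prototype (pan/tilt, color, and dimmer mapped to DMX channels) can be illustrated with a minimal sketch. The following Python snippet is illustrative only and not part of the prototype's code; the 8-channel fixture footprint, the 16-bit pan/tilt encoding, and the 540°/270° movement ranges are assumptions chosen for the example, as actual footprints vary by fixture.

```python
# Illustrative sketch (assumed channel layout, not the prototype's actual code):
# encoding core moving-head parameters into DMX-512 channel values.
from dataclasses import dataclass

DMX_UNIVERSE_SIZE = 512

@dataclass
class MovingHeadState:
    pan: float              # degrees, 0..540 (assumed range)
    tilt: float             # degrees, 0..270 (assumed range)
    rgb: tuple              # (r, g, b), each 0..255
    dimmer: float           # 0.0..1.0

def to_dmx(state: MovingHeadState, start_channel: int = 1) -> dict:
    """Encode a fixture state into DMX channel values (1-based channels)."""
    def coarse_fine(value: float, max_value: float) -> tuple:
        # 16-bit resolution split across two 8-bit DMX channels
        raw = round(value / max_value * 65535)
        return raw >> 8, raw & 0xFF

    pan_c, pan_f = coarse_fine(state.pan, 540.0)
    tilt_c, tilt_f = coarse_fine(state.tilt, 270.0)
    r, g, b = state.rgb
    channels = {
        start_channel + 0: pan_c,  start_channel + 1: pan_f,
        start_channel + 2: tilt_c, start_channel + 3: tilt_f,
        start_channel + 4: r, start_channel + 5: g, start_channel + 6: b,
        start_channel + 7: round(state.dimmer * 255),
    }
    assert all(1 <= ch <= DMX_UNIVERSE_SIZE for ch in channels)
    return channels

# Example: center a fixture at half pan/tilt range, magenta-ish color, 80% dimmer
print(to_dmx(MovingHeadState(pan=270.0, tilt=135.0, rgb=(255, 0, 64), dimmer=0.8)))
```

The coarse/fine split is what gives moving heads smooth pan/tilt motion despite the 8-bit resolution of individual DMX channels.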

The prototype demonstrated that immersive environments can effectively support spatial understanding of lighting setups and allow designers to evaluate illumination from multiple viewpoints within a virtual venue (Abdelaal et al., 2022; Schwarz, 2023; Spadoni et al., 2023). In particular, the ability to freely navigate the scene and inspect lighting results in context was perceived as a clear advantage over conventional preview methods.

To better understand the suitability of the VR-based interaction approach for professional use, an initial evaluation was conducted with lighting professionals (Lazar et al., 2017). While participants generally acknowledged the conceptual potential of immersive lighting control, the evaluation revealed a critical limitation: compared to traditional lighting consoles, controller-based VR interaction was perceived as significantly slower and less efficient. Participants emphasized the importance of a quick and efficient workflow that provides tactile feedback, fosters muscle memory, and builds on the established working habits associated with physical lighting consoles.

These findings indicate that while VR interaction can convey control and spatial context, it does not sufficiently preserve the speed and familiarity required in professional lighting workflows. At the same time, the evaluation highlighted that immersive visualization itself is highly valued. This tension between long-trained console-based operation and the spatial benefits of immersive environments directly motivates the hybrid approach explored in this master thesis, which combines physical console interaction with immersive visualization via MR passthrough.

3 Planned System and Methods

The focus lies on exploring how immersive visualization can be integrated into existing professional lighting workflows without compromising speed, familiarity, or established interaction practices.

3.1 Hybrid Workflow Concept


Figure 3.1. Mockup of a hybrid setup with real console and AR-overlay of virtual stage.

The core concept of the proposed system is a hybrid interaction workflow that deliberately preserves the physical lighting console as the primary input device. Rather than replacing established tools, the immersive system extends established workflows by providing additional visualization that is not available otherwise (Milgram & Kishino, 1994; Rixon, 2024; Speicher et al., 2019). In this hybrid approach, the physical console remains responsible for precise parameter control, while the immersive environment visualizes the resulting lighting states within a virtual stage, enabling users to evaluate illumination from multiple viewpoints without leaving the console context (Rixon, 2024; Schwarz, 2023).
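The division of labor in this hybrid workflow (the console produces DMX output, the immersive environment consumes it purely for visualization) can be sketched as follows. This is a hypothetical illustration, not the planned implementation; the 8-channel footprint and the 540°/270° movement ranges are assumptions for the example.

```python
# Hypothetical sketch of the hybrid data flow: the physical console remains the
# source of truth and emits raw DMX frames; the immersive environment decodes
# them into a virtual fixture state for rendering. Channel layout is assumed.

def decode_fixture(frame: bytes, start: int = 0) -> dict:
    """Decode one assumed 8-channel moving-head footprint from a DMX frame
    (0-based slot indices) into a render-ready virtual fixture state."""
    pan_raw = (frame[start] << 8) | frame[start + 1]    # 16-bit coarse/fine
    tilt_raw = (frame[start + 2] << 8) | frame[start + 3]
    return {
        "pan_deg": pan_raw / 65535 * 540.0,             # assumed 540° pan range
        "tilt_deg": tilt_raw / 65535 * 270.0,           # assumed 270° tilt range
        "rgb": (frame[start + 4], frame[start + 5], frame[start + 6]),
        "intensity": frame[start + 7] / 255,
    }

# Example frame: pan/tilt at mid-range, magenta-ish color, 80% intensity
frame = bytes([128, 0, 128, 0, 255, 0, 64, 204])
print(decode_fixture(frame))
```

Because the visualization side only reads the DMX stream, the console's behavior and the operator's established workflow remain untouched, which is the key design property of the hybrid approach.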

The hybrid workflow further enables hands-free interaction within the immersive environment. Since primary control is performed on the physical lighting console, users are not required to hold VR controllers during operation. This frees the hands for lightweight gesture-based interaction. Gestures are therefore designed to complement console-based input by supporting tasks that benefit from embodied interaction (Billinghurst & Kiyokawa, 2016). In this way, gesture interaction extends the advantages of immersive visualization without interfering with established professional workflows.

3.2 Pilot Feasibility Study of Passthrough Use

Before the main empirical study, a small pilot evaluation will examine how well users can operate a physical lighting console while wearing the mixed reality headset in passthrough mode. The goal is to assess the basic feasibility, comfort, and usability of console work under MR conditions, including potential issues such as reduced visual clarity of the console surface, impaired depth perception, or increased cognitive load. The pilot will involve a small number of users who perform representative console programming tasks while basic interaction data and interviews on perceived usability, visual comfort, and interference with established workflows are collected. Insights from this pilot will inform adjustments to the visual design of the passthrough overlay, and, if necessary, hardware or environmental conditions, ensuring that the subsequent main study focuses on the hybrid workflow rather than basic visibility issues.

3.3 Evaluation Approach

The evaluation targets professional lighting designers and operators who regularly engage in preprogramming workflows away from the actual venue (Bartindale, 2014; Cadena, 2017; Schwarz, 2023). A controlled comparative study is planned in which participants complete a standardized lighting task (e.g., programming a short music segment) using two workflow conditions: a conventional setup (physical console with default visualization) and the proposed hybrid MR-assisted setup (physical console combined with immersive visualization). In addition to the main comparison of workflows, the study will also revisit key aspects of passthrough usability with professional users, assessing whether operating the physical console under MR conditions is acceptable in terms of visual clarity, depth perception, and overall workload. Given the expected small sample size of professional participants, the study design will aim to ensure methodological soundness while remaining feasible in an expert context. Both within-subject and between-subject designs are considered. The final study design will be determined based on practical constraints and insights gained from pilot testing.

The evaluation focuses on three complementary dimensions: workflow integration, professional acceptance, and workflow performance. This structure reflects how XR technologies have been shown to influence professional practices in domains such as architecture and surgery (Abdelaal et al., 2022; Spadoni et al., 2023; Zhang et al., 2023).

Workflow integration addresses whether the hybrid setup can be reliably integrated into professional preprogramming workflows without introducing technical or operational friction, including the practical usability of operating the physical console via passthrough. Professional acceptance examines subjective experience, perceived usefulness, and the willingness of professionals to adopt the system within their established workflows, complemented by qualitative feedback from expert participants.

Workflow performance is assessed not solely in terms of task completion time, but as the relationship between effort, creative outcome quality, and perceived production readiness. Selected quantitative measures (e.g., task completion time, iteration counts) are therefore combined with self-reported creative satisfaction, confidence, and qualitative reflections to assess whether the hybrid approach successfully preserves professional working speed while providing the benefits of immersive visualization.

4 Work Plan

The thesis is planned over a period of six months and follows a four-phase research process.

Bibliography

Abdelaal, M., Amtsberg, F., Becher, M., Estrada, R. D., Kannenberg, F., Calepso, A. S., & Weiskopf, D. (2022). Visualization for architecture, engineering, and construction: Shaping the future of our built world. IEEE Computer Graphics and Applications, 42(2), 10–20.

Bartindale, T. (2014). Interaction design for situated media production teams [Doctoral dissertation]. Newcastle University.

Billinghurst, M., & Kiyokawa, K. (2016). Collaborative augmented reality. 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 1–11.

Branton, A., & Huizinga, S. (2025). Lighting design beyond theatre: A process for the evolving entertainment industry. CRC Press.

Cadena, R. (2017). Automated lighting: The art and science of moving and color-changing lights. Routledge.

Lazar, J., Feng, J. H., & Hochheiser, H. (2017). Research methods in human-computer interaction. Morgan Kaufmann.

Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems, E77-D(12), 1321–1329.

Rixon, T. (2024). Balancing light across multiple realities: Interrogating the potential of light within mixed reality scenography. International Journal of Performance Arts and Digital Media, 20(3), 484–506.

Schwarz, D. A. (2023). Pre-visualising progress: A new paradigm for the intuitive design and control of theatrical and stage lighting [Doctoral dissertation]. Manchester Metropolitan University.

Simpson, J. (2021). Live and life in virtual theatre: Adapting traditional theatre processes to engage creatives in digital immersive technologies. Proceedings of EVA London 2021, 109–116.

Spadoni, E., Bordegoni, M., Carulli, M., & Ferrise, F. (2023). Extended reality in industry: Past, present and future perspectives. Proceedings of the Design Society, 3, 1845–1854.

Speicher, M., Hall, B. D., & Nebeling, M. (2019). What is mixed reality? Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems.

Zhang, J., Lu, V., & Khanduja, V. (2023). The impact of extended reality on surgery: A scoping review. International Orthopaedics, 47(3), 611–621.


Contact Persons at the University of Würzburg

Dr. Jean-Luc Lugrin (Primary Contact Person)
Human-Computer Interaction, Universität Würzburg
jean-luc.lugrin@uni-wuerzburg.de

Maximilian Landeck
MA Lighting Technology GmbH
maximilian.landeck@malighting.de
