Dynamic Depth Mapping to Improve Stroke Accuracy in Virtual Reality Mid-Air Sketching Tasks
This project is already assigned.
Background
Depth cognition and distance estimation in Virtual Reality (VR) are problematic for many users (Jamiy and Marsh, 2019; Jones et al., 2008). This translates into difficulties with sketching precisely in immersive freehand sketching systems. Arora et al. (2017) found that depth mis-perception led to inaccuracies in 3D sketching. When sketching in VR, this depth error adds to the challenge of distinguishing which drawn lines lie in the foreground and which lie in the background, which can reduce accuracy when targeting specific lines or when trying to connect two or more lines. Tramper and Gielen (2011) found that participants' hand movements during tracking and tracing tasks differed in the depth plane compared to the frontal plane.
Li et al. (2022) address the problem of depth estimation by introducing color as a depth cue. In their study, users sat in a chair and drew sketches in front of themselves. We aim to introduce depth cues, such as color or opacity changes, into the sketching environment while retaining the holistic freedom of VR sketching and allowing users to move around their sketches to gain different perspectives. A comparative user study will be conducted to measure the influence of depth cues on users' ability to accurately sketch and perceive strokes in a 3D environment. The goal of this work is to measure the influence of three depth cues (single-color, multi-color, and opacity) on the accuracy of 3D sketching tasks, sketch quality, and system usability.
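To illustrate how such cues could be realized in Unity, the following C# sketch maps the distance between a stroke point and the user's head (the main camera) to either a color gradient (multi-color cue) or an alpha value (opacity cue). This is a minimal sketch under assumptions: the class name DepthCueMapper, the distance range, and the gradient endpoints are illustrative choices, not part of the planned system.

using UnityEngine;

// Illustrative sketch: derives a color or opacity depth cue for a stroke point
// from its distance to the user's head. All parameter values are assumptions.
public class DepthCueMapper : MonoBehaviour
{
    [SerializeField] private float nearDistance = 0.3f;   // closest expected sketching distance (m)
    [SerializeField] private float farDistance = 1.2f;    // farthest expected sketching distance (m)
    [SerializeField] private Color nearColor = Color.red; // cue for strokes close to the user
    [SerializeField] private Color farColor = Color.blue; // cue for strokes far from the user

    // Normalizes the point's distance from the head into [0, 1].
    private float DepthFactor(Vector3 strokePoint)
    {
        float distance = Vector3.Distance(Camera.main.transform.position, strokePoint);
        return Mathf.InverseLerp(nearDistance, farDistance, distance);
    }

    // Multi-color cue: interpolate between two hues along the depth axis.
    public Color ColorCue(Vector3 strokePoint)
    {
        return Color.Lerp(nearColor, farColor, DepthFactor(strokePoint));
    }

    // Opacity cue: keep one hue but fade distant strokes.
    public Color OpacityCue(Vector3 strokePoint, Color baseColor)
    {
        baseColor.a = Mathf.Lerp(1f, 0.3f, DepthFactor(strokePoint));
        return baseColor;
    }
}

In a sketching system, such a component would typically be queried per stroke vertex so that the stroke's vertex colors encode depth continuously along its length.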
Tasks
The project will focus on the following tasks:
- Acquaintance with the topic and the theoretical background
- Design and implementation of sketch assistance methods
- Comparative user study to evaluate the sketch assistance methods
- Evaluation and presentation of results
Prerequisites
- Introduction to Programming (Java, C++ or C#), e.g., 10-MCS-EinP
- Game Engine Experience (Unity)
- Knowledge of research methods and statistics
- Beneficial: Advanced Programming, e.g., 10-MCS-GADS, 10-MCS-EPP and 10-MCS-SPSE
- Beneficial: Usability and Software Ergonomics, e.g., 06-MCS-Usab
- Beneficial: Interactive Computer Graphics, e.g., 10-MCS-ICGV
Literature
- Armbrüster, C., Wolter, M., Kuhlen, T., Spijkers, W., & Fimm, B. (2008). Depth perception in virtual reality: distance estimations in peri- and extrapersonal space. Cyberpsychology & Behavior, 11(1), 9-15.
- Arora, R., Kazi, R. H., Anderson, F., Grossman, T., Singh, K., & Fitzmaurice, G. W. (2017, May). Experimental Evaluation of Sketching on Surfaces in VR. In CHI (Vol. 17, pp. 5643-5654).
- Behrendt, B., Berg, P., Preim, B., & Saalfeld, S. (2017). Combining Pseudo Chroma Depth Enhancement and Parameter Mapping for Vascular Surface Models. In S. Bruckner, A. Hennemuth, B. Kainz, I. Hotz, D. Merhof & C. Rieder (Eds.), Eurographics Workshop on Visual Computing for Biology and Medicine. The Eurographics Association.
- Heinrich, F., Bornemann, K., Lawonn, K., & Hansen, C. (2019). Depth Perception in Projective Augmented Reality: An Evaluation of Advanced Visualization Techniques. Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology.
- Jamiy, F. E., & Marsh, R. (2019). Distance Estimation In Virtual Reality And Augmented Reality: A Survey. 2019 IEEE International Conference on Electro Information Technology (EIT), 063–068.
- Jones, J. A., Swan, J. E., Singh, G., Kolstad, E., & Ellis, S. R. (2008). The Effects of Virtual Reality, Augmented Reality, and Motion Parallax on Egocentric Depth Perception. Proceedings of the 5th Symposium on Applied Perception in Graphics and Visualization, 9–14.
- Li, Z., Cui, Y., Zhou, T., Jiang, Y., Wang, Y., Yan, Y., Nebeling, M., & Shi, Y. (2022). Color-to-Depth Mappings as Depth Cues in Virtual Reality. Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology.
- Tramper, J. J., & Gielen, C. C. A. M. (2011). Visuomotor Coordination Is Different for Different Directions in Three-Dimensional Space. Journal of Neuroscience, 31(21), 7857–7866.
Contact Persons at the University of Würzburg
Samantha Monty (Primary Contact Person)
Human-Computer Interaction, Universität Würzburg
samantha.monty@uni-wuerzburg.de
Prof. Dr. Marc Erich Latoschik
Human-Computer Interaction, Universität Würzburg
marc.latoschik@uni-wuerzburg.de