Human-Computer Interaction

Evi-Grasp VR


This call for a thesis or project is open for the following modules:
If you are interested, please get in touch with the primary contact person listed below.

Background

The Danone Research Packaging Centre, housed in the Evian water plant, is developing new materials for bottles that are attractive, ergonomic, and ecologically sound from a sustainable development perspective (see Figure 1 and (more here)).

Figure 1. In 2018, evian pledged to become a circular brand and to make all of its plastic bottles from 100% recycled materials (rPET) by 2025, as part of the carbon-neutrality journey Danone and evian have undertaken by continuously measuring and reducing carbon emissions at each stage of a bottle's life cycle: from the materials used, to production, transportation, and recycling.

However, in the design phase, it is often difficult to anticipate the position of the hand and fingers while grasping a bottle without making mock-ups or real samples, especially when experimenting with new environment-friendly materials. In addition, a wide panel of hands will manipulate the bottles; the prehension should therefore be suitable for small or large hands and for left-handed or right-handed people.

Therefore, the project consists of developing a prototype of a VR-based tool called “Evi-Grasp”, capable of simulating precise water-bottle grasping and manipulation in VR. The application should provide a way to quickly test the graspability of water bottles with virtual hands in VR. As illustrated by Figures 2 and 3, the system should decide how the fingers and thumb would be positioned around a bottle while grasping it. The model should take different hand sizes into consideration and propose a choice between left and right hands.

Figure 2. Interpenetration of the real hand (yellow nodes) in the virtual object; the real hand is inside the red object [Delrieu et al., 2020]
The possible penetration of the real tracked hand into the virtual object remains a major issue. Indeed, this penetration prevents us from meaningfully determining either the hand configuration or the contact points on the object surface, leading to unrealistic and unstable manipulation.
Figure 3. Grasp showing the tracked hand (mesh) that sank into the virtual object and the virtual hand (solid) that remained at the object's surface [Prachyabrued and Borst, 2011]
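A common way to cope with this penetration (as in the figures above) is to keep two hand representations: the tracked hand follows the sensor data and may sink into the object, while the visual hand is constrained to stay on the object's surface. The sketch below illustrates the idea in a heavily simplified form, with the bottle approximated as a vertical cylinder; the function name and geometry are illustrative assumptions, not the actual algorithm of [Delrieu et al., 2020].

```python
import math

def constrain_to_cylinder(p, radius):
    """Project a tracked fingertip onto the surface of a vertical
    cylinder (a crude stand-in for the bottle) whenever the tracked
    point has penetrated it; points outside are left untouched."""
    x, y, z = p
    d = math.hypot(x, y)          # radial distance from the bottle axis
    if d >= radius or d == 0.0:   # outside (or exactly on the axis): no correction
        return p
    scale = radius / d            # push the point back out to the surface
    return (x * scale, y * scale, z)

# The tracked fingertip sinks 2 cm into a 4 cm-radius bottle ...
tracked = (0.02, 0.0, 0.10)
# ... but the visual fingertip is held at the surface:
visual = constrain_to_cylinder(tracked, 0.04)
print(visual)  # (0.04, 0.0, 0.1)
```

In a full system the same constraint would be solved for the whole hand skeleton (e.g. with a physics engine and joint coupling), but the penetration depth discarded here is exactly the signal that can later drive pressure estimation.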

In addition to the virtual display of the hand, it would be desirable to obtain a cartography of the pressure zones to apply as an input parameter for [Finite Element Analysis (FEA) simulations](https://www.simscale.com/docs/simwiki/fea-finite-element-analysis/what-is-fea-finite-element-analysis/), depicted via a color scale (or heatmap) that shows, for example, the pressure distribution over the object. Such a map would help to find weak spots or areas of tension that make the bottle crack, split, and leak water (see Figure 4 below for an example of such a heatmap when squeezing a bottle).

Figure 4. Heatmap showing the stress load (in green) on a plastic bottle

Task

The goal of this project is the development and evaluation of the first version of Evi-Grasp, a VR system that helps to evaluate the graspability of a water bottle in VR without using a haptic data glove (i.e. with 3D controllers or hand tracking). Following the work of [Delrieu et al., 2020], [Prachyabrued and Borst, 2011], [Pitarch, 2008], and [Borst and Indugula, 2005], the system will provide automatic virtual hand animation using a physically based approach to grasping and manipulation (aka auto-posing, as illustrated in Figures 6 and 7). The system will also be able to determine whether a grasp is feasible or not, and to apply different finger pressures without using a haptic force-feedback device such as a VR data glove (e.g. Teslasuit Glove, BeBop data glove, CyberGrasp).

Ideally, the application should also include configurable and modifiable virtual hands (e.g. different sizes, genders, ages), as well as the possibility to easily import different bottle designs and to visualize data from FEA simulations.

Figure 6. Example of auto-posing VR grasping hands using Unreal Engine 4: the virtual hand shape automatically adapts to the object shape
Figure 7. Example of auto-posing VR grasping hands using Oculus hand tracking with [Gleechi VirtualGrasp](https://www.gleechi.com/virtualgrasp): the virtual fingers automatically adapt to the object shape
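The core idea behind auto-posing can be illustrated with a toy model: curl a finger joint by joint until the fingertip reaches the object surface, then stop. The planar two-link finger, the equal-angle curl, and the circular bottle cross-section below are simplifying assumptions for illustration only; real systems (e.g. Gleechi VirtualGrasp or the physics-based approaches cited above) solve this per finger segment with proper collision geometry.

```python
import math

def autopose_finger(base, lengths, surface_dist, step=math.radians(1)):
    """Curl a planar finger (all joints share one curl angle) until the
    fingertip reaches the object surface, mimicking auto-posing.
    `surface_dist(p)` is a signed distance: negative means inside."""
    x, y = base
    for i in range(91):                   # sweep the curl from 0 to 90 degrees
        angle = i * step
        # forward kinematics: each link adds `angle` of downward curl
        x, y, a = base[0], base[1], 0.0
        for L in lengths:
            a += angle
            x += L * math.cos(a)
            y -= L * math.sin(a)
        if surface_dist((x, y)) <= 0.0:   # fingertip touched the object
            return angle, (x, y)
    return math.radians(90), (x, y)       # fully curled without contact

# Bottle cross-section approximated as a circle of radius 0.04 m at (0, -0.06)
bottle = lambda p: math.hypot(p[0], p[1] + 0.06) - 0.04

# A two-phalanx finger (4 cm + 3 cm) curling onto the bottle:
angle, tip = autopose_finger(base=(0.0, 0.0), lengths=[0.04, 0.03],
                             surface_dist=bottle)
```

The returned curl angle gives the finger pose, and the residual penetration at contact is the quantity a physically based implementation would feed into the pressure estimation discussed earlier.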

The task is broken down as follows:

  1. Summarize related work on VR virtual hand simulations and existing tools/libraries (e.g. Gleechi VirtualGrasp).
  2. Identify favorable approaches (techniques, target factors, etc.) from the related work.
  3. Design your own approach for proof-of-concept development.
  4. Implement and test your own approach.
  5. Evaluate with industrial partners.
  6. Provide guidelines and a list of improvements for the next prototype.

The described tasks can be carried out by students of any of the programs Human-Computer Interaction, Games Engineering, Computer Science, or Human-Computer Systems. The extent of the task and its main focus will be tailored to the respective program of study and final goal (project, BSc, or MSc thesis); i.e., depending on the workload of tasks 1–4, the overall progress, and the scope (e.g., bachelor's degree), tasks 5 and 6 may not be part of the overall assignment but will instead be assigned as individual independent task(s).

The task offers the additional benefit of working on a real-world problem together with our industrial partner from the Danone Research Centre housed in the Evian water plant. Therefore, travel to Évian (France) for the project kick-off and for the validation of the project may be envisaged.

Specific Hardware Requirement

Prerequisites

Favorable

Experience in one or several of

References

T. Delrieu, V. Weistroffer and J. P. Gazeau, “Precise and realistic grasping and manipulation in Virtual Reality without force feedback,” 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 2020, pp. 266-274, doi: 10.1109/VR46266.2020.00046.

M. Prachyabrued and C. W. Borst, “Dropping the ball: Releasing a virtual grasp,” 2011 IEEE Symposium on 3D User Interfaces (3DUI), Singapore, 2011, pp. 59-66, doi: 10.1109/3DUI.2011.5759218.

E. Peña Pitarch, “Virtual human hand: Grasping strategy and simulation,” PhD thesis, Universitat Politècnica de Catalunya, 2008.

C. W. Borst and A. P. Indugula, “Realistic virtual grasping,” IEEE Proceedings. VR 2005. Virtual Reality, 2005., Bonn, Germany, 2005, pp. 91-98, doi: 10.1109/VR.2005.1492758.


Contact Persons at the University of Würzburg

Jean-Luc Lugrin (Primary Contact Person)
Human-Computer Interaction, Universität Würzburg
jean-luc.lugrin@uni-wuerzburg.de

Romain Savajano
Danone Water Research
romain.savajano@danone.com

Legal Information