Human-Computer Interaction

Biohybrid Realities - Playing It Real!


This project has already been completed.

Background

The EU project Flora Robotica [1] explores the potential of bringing together plants and robots. Simple sensory-motor robotic units can water and groom the plants, provide them with scaffolding, or direct their growth, for instance by shining light on them or shielding it off. Based on such interactions, the plants’ growth can be directed around the clock and over long periods of time. As a result, completely new forms of plant-based artefacts and architectures may emerge. The prospect of years, decades, potentially even centuries of growth and evolution of biohybrid systems requires us to run computational simulations to visualise and plan the outcome over time. The Biohybrid Realities project aims at making such simulations possible. Previously, we have shown how an augmented reality interface for biohybrid system visualisation and planning could work [2]. In this context, we have deployed a very rudimentary developmental plant model [3], which we have since extended to incorporate not only phototropism (growth towards light) but also lignification (the stem becoming increasingly rigid), shade avoidance, and other plant behaviours.
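To give a feel for the behaviours mentioned above, the following is a minimal, hypothetical sketch (not the project's actual model) of a single stem segment that bends towards a light source (phototropism) while stiffening with age (lignification); the light position, bend factor, and rigidity schedule are illustrative assumptions only.

```python
import math

LIGHT = (5.0, 10.0)  # hypothetical light position (x, y)

class Segment:
    """One stem tip: position, growth direction, and age-based rigidity."""
    def __init__(self, x, y, dx, dy):
        self.x, self.y = x, y
        self.dx, self.dy = dx, dy
        self.age = 0

    @property
    def rigidity(self):
        # Lignification: older tissue bends less (0 = flexible, 1 = rigid).
        return min(1.0, self.age / 10.0)

    def grow_step(self):
        # Phototropism: bend the growth direction towards the light,
        # attenuated by the current rigidity.
        to_x, to_y = LIGHT[0] - self.x, LIGHT[1] - self.y
        norm = math.hypot(to_x, to_y) or 1.0
        bend = 0.3 * (1.0 - self.rigidity)  # assumed bending strength
        self.dx += bend * to_x / norm
        self.dy += bend * to_y / norm
        d = math.hypot(self.dx, self.dy) or 1.0
        self.dx, self.dy = self.dx / d, self.dy / d
        self.x += self.dx
        self.y += self.dy
        self.age += 1

seg = Segment(0.0, 0.0, 0.0, 1.0)  # start at the origin, growing straight up
for _ in range(20):
    seg.grow_step()
print(seg.x, seg.y)  # the tip has drifted towards the light at x = 5
```

A robotic unit shining or shielding light would, in such a model, simply move or mask the light source between growth steps.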

Tasks

In this work, the student sets out to develop and evaluate a new version of the augmented reality biohybrid system that runs on mobile phones and tablets. Existing code can of course be reused, but that is not mandatory, and neither is the development platform. The constituents of the AR biohybrid system are: (1) the implementation of a simple plant-growth model, or the integration of a provided, more sophisticated one; (2) the implementation of simple, physically represented and behaviourally active robotic units; (3) the augmented reality interface that allows the user to place and configure robots, and to control the simulation to see how the biohybrid system would unfold in the future. The recognition of the environment to enable augmentation can be as simple as the use of QR codes or as advanced as a detailed environmental depth scan relying on appropriate hardware such as a Bridge Headset or a Microsoft HoloLens, whichever is available at the beginning of the project work. The Biohybrid Realities project is conducted in close collaboration with our partners at the Universities of Lübeck (Prof. Hamann) and Graz (Prof. Schmickl) and at California State University (Prof. Pietroszek). Their feedback during the project work will offer great opportunities for the student to present and hone their work. Overall, the following steps will ensure a fruitful project: (1) research on swarm grammars, (2) study of the existing code bases, (3) implementation design and realisation, (4) AR usability studies, (5) presentation and demo to our partners.
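As orientation for step (1): swarm grammars [3] build on L-system-style parallel rewriting, where every symbol of a word is replaced simultaneously according to a rule set. The sketch below shows only that formal core with purely illustrative rules and axiom; it is not the grammar used in the project.

```python
# Illustrative rules: an apex 'A' extends and spawns a branch point 'B';
# a branch point 'B' matures into a new apex.
RULES = {
    "A": "AB",
    "B": "A",
}

def rewrite(word, rules):
    """Apply all rules in parallel to every symbol of the word."""
    return "".join(rules.get(symbol, symbol) for symbol in word)

word = "A"
for generation in range(5):
    word = rewrite(word, RULES)
print(word)  # prints "ABAABABAABAAB" (word lengths follow the Fibonacci numbers)
```

In a swarm grammar, such rules additionally govern agents that move and build in space, which is what makes them suitable for growing tree-like structures.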

Prerequisites

A background in computer graphics, AR, and game engine programming is a great asset for this work.

References

[1] Heiko Hamann. flora robotica: Societies of Symbiotic Robot-Plant Bio-Hybrids as Social Architectural Artifacts. Project homepage, http://www.florarobotica.eu/, December 2015.

[2] Sebastian von Mammen, Heiko Hamann, and Michael Heider. Robot gardens: an augmented reality prototype for plant-robot biohybrid systems. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, pages 139–142, München, 2016. ACM.

[3] Sebastian von Mammen and Christian Jacob. The evolution of swarm grammars: Growing trees, crafting art and bottom-up design. IEEE Computational Intelligence Magazine, 4:10–19, August 2009.


Contact Persons at the University of Würzburg

Sebastian von Mammen (Primary Contact Person)
Mensch-Computer-Interaktion, Universität Würzburg
sebastian.von.mammen@uni-wuerzburg.de
