Human-Computer Interaction

UbiquX


This project is already completed.

Background

Modern portable computer systems, like smartphones or tablets, include a variety of sensors, such as cameras, GPS, or gyroscopes. In addition, current mobile network coverage permits these devices to establish internet connections in many locations, from inner-city areas to remote places. This allows the creation of pervasive/ubiquitous games: games that require the player to interact with the real world to proceed in the game. Most pervasive mobile games today use QR codes to link real and virtual items and places, like “Codename Heroes” (Back and Waern, 2014). Even though they are widely used, QR codes have a few drawbacks:

  1. They have to be placed in the real world and checked from time to time, making it difficult and expensive to create and maintain a large game world.
  2. They can’t be placed everywhere; especially at attractions that are particularly interesting for pervasive games, such as castles, churches, and forests, placing them is often forbidden.

As a way around the limitations of QR codes, we want to integrate a system that uses natural markers instead of QR codes. This allows for greater scalability and lower maintenance costs. The system will use GPS to guide players to photospots and a gyroscope to help them replicate the reference picture by assisting them in finding the right angles (Fendt, 2014).
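
The following Android sketch illustrates how such gyroscope assistance could work. It is a minimal example under our own assumptions, not code from the project or from Fendt (2014): it reads the rotation-vector sensor and reports how far the device heading and tilt are from a target orientation stored with a photospot. The AngleGuide class, the target angles, and the AngleHintListener callback are illustrative names.

// Minimal sketch, assuming a target orientation is stored per photospot.
// Not the project's actual implementation.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class AngleGuide implements SensorEventListener {

    public interface AngleHintListener {
        void onAngleOffset(float azimuthOffsetDeg, float pitchOffsetDeg);
    }

    private final SensorManager sensorManager;
    private final float targetAzimuthDeg;   // stored with the photospot (assumption)
    private final float targetPitchDeg;     // stored with the photospot (assumption)
    private final AngleHintListener listener;

    public AngleGuide(Context context, float targetAzimuthDeg,
                      float targetPitchDeg, AngleHintListener listener) {
        this.sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        this.targetAzimuthDeg = targetAzimuthDeg;
        this.targetPitchDeg = targetPitchDeg;
        this.listener = listener;
    }

    public void start() {
        Sensor rotation = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        sensorManager.registerListener(this, rotation, SensorManager.SENSOR_DELAY_UI);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float[] rotationMatrix = new float[9];
        float[] orientation = new float[3];
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);

        float azimuthDeg = (float) Math.toDegrees(orientation[0]); // heading
        float pitchDeg = (float) Math.toDegrees(orientation[1]);   // tilt

        // Report how far the player has to turn/tilt to match the reference photo.
        listener.onAngleOffset(normalize(targetAzimuthDeg - azimuthDeg),
                               targetPitchDeg - pitchDeg);
    }

    private static float normalize(float deg) {
        while (deg > 180f) deg -= 360f;
        while (deg < -180f) deg += 360f;
        return deg;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* not needed here */ }
}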

Project Goals

UbiquX, short for “UBIQUitous eXperience”, is an MMORPG (Massively Multiplayer Online Role-Playing Game) that is played in both the real and the virtual world (mixed reality and ubiquitous computing). Players can scan QR codes or search for points placed in the real world to trigger events that they play with their virtual character. In addition, every player can place items and traps/guards anywhere and thereby generate new events for others to play (user-generated content). This way the library of photospots keeps growing, allowing authors/admins to add new content without having to search for new spots themselves, possibly via a modified “place item + trap” mechanic in an author version of the app.
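
To make the idea of user-generated photospots more concrete, here is a minimal data-model sketch; the class and field names are purely illustrative assumptions, not the project’s actual schema.

// Illustrative data model only: a photospot placed in the world can carry
// player-created items or traps/guards, which become events for other players.
public class PhotoSpot {
    public final String id;
    public final double latitude;           // GPS position used to guide players
    public final double longitude;
    public final String referencePhotoUrl;  // natural marker the player must re-take
    public final String placedByPlayerId;   // author, admin, or regular player (UGC)
    public final java.util.List<PlacedObject> placedObjects = new java.util.ArrayList<>();

    public PhotoSpot(String id, double latitude, double longitude,
                     String referencePhotoUrl, String placedByPlayerId) {
        this.id = id;
        this.latitude = latitude;
        this.longitude = longitude;
        this.referencePhotoUrl = referencePhotoUrl;
        this.placedByPlayerId = placedByPlayerId;
    }

    // Objects other players encounter (fight, disarm, or collect) at the spot.
    public static class PlacedObject {
        public enum Kind { ITEM, TRAP, GUARD }
        public final Kind kind;
        public final String ownerPlayerId;

        public PlacedObject(Kind kind, String ownerPlayerId) {
            this.kind = kind;
            this.ownerPlayerId = ownerPlayerId;
        }
    }
}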

The goal of our project is to develop an online game in the form of an Android app, based on our previous work from the programming course (video). The game logic is based on the tabletop RPG “Quest”. In addition to the QR scanner, we will implement a picture-recognition function based on the Bachelor project “Konzeption und Implementierung eines Geo-Cache-basierten Spiels für die Android-Plattform” (Conception and Implementation of a Geo-Cache-Based Game for the Android Platform) by Michael Fendt, 2014.
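
As a rough illustration of what such a picture-recognition step could look like, the sketch below compares the player’s photo against the stored reference photo of a photospot via ORB feature matching. It assumes OpenCV for Android as the library and is not the approach of Fendt’s thesis or the project’s actual implementation; the class name, file paths, and match thresholds are placeholders.

// Hedged sketch of a possible picture-recognition step using OpenCV.
// (In a real Android app the OpenCV native library must be initialised first.)
import org.opencv.core.DMatch;
import org.opencv.core.Mat;
import org.opencv.core.MatOfDMatch;
import org.opencv.core.MatOfKeyPoint;
import org.opencv.features2d.DescriptorMatcher;
import org.opencv.features2d.ORB;
import org.opencv.imgcodecs.Imgcodecs;

public class PhotoSpotMatcher {

    // Returns true if enough good feature matches are found between the
    // player's photo and the stored reference photo.
    public static boolean matchesReference(String playerPhotoPath,
                                           String referencePhotoPath) {
        Mat playerImg = Imgcodecs.imread(playerPhotoPath, Imgcodecs.IMREAD_GRAYSCALE);
        Mat referenceImg = Imgcodecs.imread(referencePhotoPath, Imgcodecs.IMREAD_GRAYSCALE);

        // Detect keypoints and compute binary descriptors in both images.
        ORB orb = ORB.create();
        MatOfKeyPoint kpPlayer = new MatOfKeyPoint();
        MatOfKeyPoint kpReference = new MatOfKeyPoint();
        Mat descPlayer = new Mat();
        Mat descReference = new Mat();
        orb.detectAndCompute(playerImg, new Mat(), kpPlayer, descPlayer);
        orb.detectAndCompute(referenceImg, new Mat(), kpReference, descReference);

        // Brute-force Hamming matching is the standard pairing for ORB descriptors.
        DescriptorMatcher matcher =
                DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
        MatOfDMatch matches = new MatOfDMatch();
        matcher.match(descPlayer, descReference, matches);

        // Count matches with a low Hamming distance; the thresholds (50 / 25)
        // are placeholders, not tuned project parameters.
        int goodMatches = 0;
        for (DMatch m : matches.toArray()) {
            if (m.distance < 50) {
                goodMatches++;
            }
        }
        return goodMatches >= 25;
    }
}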

Mandatory requirements

Optional requirements

References

Jon Back, Annika Waern, Codename Heroes: Design for Experience in Public Places in a Long Term Pervasive Game, In Foundations of Digital Games. 2014.

Martin Fischbach, Jean-Luc Lugrin, Marc Erich Latoschik, Michael Fendt, Picture-based Localisation For Pervasive Gaming, In Virtuelle und Erweiterte Realität, 11. Workshop der GI-Fachgruppe VR/AR. Springer, 2014. To appear.

Martin Fischbach, Chris Zimmerer, Anke Giebler-Schubert, Marc Erich Latoschik, DEMO: Exploring Multimodal Interaction Techniques for a Mixed Reality Digital Surface, In IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, 2014. To appear.

Supervisor

M.Sc. Martin Fischbach
Phone: 0931 31 86314
E-Mail: martin.fischbach@uni-wuerzburg.de

