Human-Computer Interaction

Developing a Framework to Analyze Handwriting Text Input on a Tablet


This project is already assigned.

Motivation

The goal of this project is to develop a framework that allows the evaluation of handwriting text input on a tablet (e.g., an Apple iPad Pro with Apple Pencil).

The recent growth in the market for augmented reality (AR) devices, as well as their strong focus on knowledge and office work [1, 2], has created a new need for input methods (IMEs) that go beyond the long-established combination of keyboard and mouse. Current IMEs are usually designed to be used while seated at a desk (i.e., the physical keyboard), while AR devices promote working in new, more flexible environments, such as standing, on couches, or outdoors; the leading manufacturers’ marketing materials show a clear trend toward “working away from the desk”.

The increased adoption of Video See-Through AR (VST AR) offers several advantages over Virtual Reality (VR) applications: in a VST AR environment, users primarily see their actual surroundings, augmented with additional virtual content. This makes it possible to interact with both the real world and digital media at the same time, which in turn allows collaborative work with people in the same room as well as in remote locations. It also makes working in new environments more attractive, because users remain aware of their real surroundings.

Inspired by the possibility of using the on-screen keyboard of a smartphone as an IME for “display only” devices such as a Google TV [3] or an Android Auto based car infotainment system, this project will focus on using a tablet as an IME for text input in a VST AR environment. Re-purposing existing devices is more sustainable than, e.g., developing a new type of controller, and a purely software-based solution will benefit far more users, because many users already own a tablet privately. Using a tablet instead of a smartphone, as in the earlier examples, promises better ergonomics and more possibilities for collaborative tasks (the larger screen provides better readability for both the user and potential collaborators), and it allows using handwriting for text input instead of a tap or swipe keyboard. A dedicated tablet also enables reuse across VST AR devices (e.g., Meta Quest 3 and Apple Vision Pro) and game engines (e.g., Unity and Unreal Engine). Future projects could also evaluate the usability of a tablet as a pointing device, comparable to screen-less drawing tablets, e.g., for creative tasks such as drawing.

Using digital-ink-based handwriting recognition promises high usability, since users can rely on motor skills they generally internalized in childhood. Focusing only on VST AR environments (instead of also considering VR environments) also removes the challenge of tracking a tablet and stylus, which has been a problem in similar previous studies [4].

Goal

This project will first analyze which measurements are suitable for evaluating the quality of such a tablet- and handwriting-based IME, and then develop a prototype that allows the collection of these measurements. This prototype will be based on the one used by Kern et al. (2024), “Handwriting for Text Input and the Impact of XR Displays, Surface Alignments, and Sentence Complexities” [5], and will expand especially on its limitations.

The measurements should cover four separate areas (see [4, 5, 6]) to make the prototype as adaptable as possible for future studies: text input performance in general, to make comparisons with other IMEs possible, and handwriting text input performance in particular, to allow comparisons with other environments (e.g., VST AR vs. VR). In both cases, the prototype should allow evaluating both the product and the process of the input: the final recognized text, as well as the steps it took to get there. A preliminary list of relevant measurements can be found in Table 1:

              Process                                   Product
Text Input    Total Error Rate (TER)                    Words per Minute (WPM),
                                                        Minimum String Distance Error Rate (MSD ER)
Handwriting   Number of inversions in velocity (NIV)    Dimensions, stroke count

Table 1: Preliminary list of measurements and their categorization
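
To make these categories concrete, the following sketch illustrates how some of the listed measurements could be computed, written in TypeScript to match the web-technology stack described under Tasks. The function names, the sample format, and the exact NIV definition used here are assumptions for illustration, not part of the planned prototype.

// Illustrative sketch of measurement calculations; names and formats are
// assumptions, not part of the planned prototype.

// Words per Minute: the common text entry definition, where a "word" is five
// characters and the first character carries no measurable entry time.
function wordsPerMinute(transcribed: string, seconds: number): number {
  return ((transcribed.length - 1) / seconds) * (60 / 5);
}

// Minimum String Distance: the Levenshtein distance between presented and
// transcribed text, computed via dynamic programming.
function minimumStringDistance(a: string, b: string): number {
  const d: number[][] = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      d[i][j] = Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost);
    }
  }
  return d[a.length][b.length];
}

// MSD Error Rate in percent, normalized by the longer of the two strings.
function msdErrorRate(presented: string, transcribed: string): number {
  const longer = Math.max(presented.length, transcribed.length);
  return longer === 0 ? 0 : (minimumStringDistance(presented, transcribed) / longer) * 100;
}

// Number of inversions in velocity (NIV), read here as the number of times the
// pen-tip speed profile switches between acceleration and deceleration.
function inversionsInVelocity(speeds: number[]): number {
  let inversions = 0;
  let prevSign = 0;
  for (let i = 1; i < speeds.length; i++) {
    const sign = Math.sign(speeds[i] - speeds[i - 1]);
    if (sign !== 0 && prevSign !== 0 && sign !== prevSign) inversions++;
    if (sign !== 0) prevSign = sign;
  }
  return inversions;
}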

Tasks

The prototype should use a server/client architecture and be developed with common web technologies, to keep experimental setups as simple as possible while also allowing easy integration with other applications. The client will be an easy-to-use website that embeds the MyScript library [7] and collects the raw data needed to calculate the previously defined measurements. The server will process this raw data, calculate the measurements, and save the results in a format suitable for later evaluation. The functionality of the prototype shall be evaluated in a pilot study.
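
As a rough sketch of this data flow, the client below captures raw pen samples with the standard Pointer Events API and forwards each completed stroke to the server. The /strokes endpoint, the payload shape, and the canvas id are assumptions for illustration; the actual MyScript embedding is omitted here.

// Client-side sketch: record raw stylus samples on a canvas and hand each
// finished stroke to the server. Endpoint and payload shape are assumed for
// illustration; MyScript would handle the handwriting recognition separately.

interface PenSample { x: number; y: number; pressure: number; t: number }

const canvas = document.querySelector<HTMLCanvasElement>("#writing-area")!;
let stroke: PenSample[] = [];

canvas.addEventListener("pointerdown", (e) => {
  if (e.pointerType !== "pen") return; // only record stylus input
  stroke = [{ x: e.offsetX, y: e.offsetY, pressure: e.pressure, t: e.timeStamp }];
});

canvas.addEventListener("pointermove", (e) => {
  if (e.pointerType !== "pen" || stroke.length === 0) return;
  stroke.push({ x: e.offsetX, y: e.offsetY, pressure: e.pressure, t: e.timeStamp });
});

canvas.addEventListener("pointerup", async (e) => {
  if (e.pointerType !== "pen" || stroke.length === 0) return;
  // The server derives the process and product measurements from the samples.
  await fetch("/strokes", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ samples: stroke }),
  });
  stroke = [];
});

Storing the raw samples on the server, rather than only the recognized text, is what would allow both the process and the product measurements from Table 1 to be derived in later evaluations.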

Literature

  1. https://www.meta.com/de/quest/quest-3/
  2. https://www.apple.com/de/newsroom/2023/06/introducing-apple-vision-pro/
  3. https://support.google.com/chromecast/answer/11221499?hl=en
  4. Didehkhorshid, S. A. A., Philip, S., Samimi, E., & Teather, R. J. (2020). Text Input in Virtual Reality Using a Tracked Drawing Tablet. In C. Stephanidis, J. Y. C. Chen, & G. Fragomeni (Eds.), HCI International 2020 – Late Breaking Papers: Virtual and Augmented Reality (Vol. 12428, pp. 314–329). Springer International Publishing. https://doi.org/10.1007/978-3-030-59990-4_24
  5. Kern, F., Tschanter, J., & Latoschik, M. E. (2024). Handwriting for Text Input and the Impact of XR Displays, Surface Alignments, and Sentence Complexities. IEEE Transactions on Visualization and Computer Graphics, 30(5), 2357–2367. https://doi.org/10.1109/TVCG.2024.3372124
  6. Gerth, S., Dolk, T., Klassert, A., Fliesser, M., Fischer, M. H., Nottbusch, G., & Festman, J. (2016). Adapting to the surface: A comparison of handwriting measures when writing on a tablet computer and on paper. Human Movement Science, 48, 62–73. https://doi.org/10.1016/j.humov.2016.04.006
  7. https://www.myscript.com/de/sdk

Contact Persons at the University of Würzburg

Florian Kern (Primary Contact Person)
Human-Computer Interaction, Universität Würzburg
florian.kern@uni-wuerzburg.de

Prof. Dr. Marc Erich Latoschik
Human-Computer Interaction, Universität Würzburg
marc.latoschik@uni-wuerzburg.de
