Human-Computer Interaction

Development and Evaluation of a Personality-Enhanced Virtual Audience


This project is already assigned.

Introduction

Virtual Reality systems have been used for many years to train individuals and to treat anxieties. One of these treatment and training scenarios is public speaking. The University of Würzburg created a virtual audience system that can help treat speaking anxiety and help people train their speaking skills. In its current state, the virtual agents can only display a limited number of animations and thus behaviors. This lack of heterogeneity between agent animations reduces the overall realism of the simulation and, hence, the feeling of “being present” in front of an audience.

For treatment and training to be effective, a certain amount of presence is required in the virtual space; in this context, the social presence of the virtual agents is particularly important. We theorize that by increasing the heterogeneity of the virtual audience’s animations, we can further increase the realism and social presence of the virtual audience system. The main question we seek to answer is: “How can we increase and evaluate the perceived heterogeneity of a virtual audience without creating many different animations?” The approach we propose is to modify existing animations at run-time based on a personality model assigned to the virtual agents. Our system will adapt an animation’s speed and amplitude, the agent’s posture, and other parameters according to the agent’s personality traits.
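
To make this concrete, the following minimal C++ sketch shows what such a trait-to-parameter mapping could look like. All names, value ranges, and the linear mapping itself are illustrative assumptions on our part, not the final design:

    // Minimal sketch of a trait-to-parameter mapping (all names, ranges,
    // and the linear form are illustrative assumptions, not the final design).
    struct Personality {
        float extraversion;        // normalized to [0, 1]
        float emotionalStability;  // normalized to [0, 1]
    };

    struct AnimationParams {
        float speedScale;      // playback-rate multiplier
        float amplitudeScale;  // joint-rotation amplitude multiplier
        float postureLeanDeg;  // forward/backward lean offset in degrees
    };

    // Hypothetical mapping: extraverted agents move faster and larger;
    // emotionally stable agents hold a more upright posture.
    AnimationParams ParamsFor(const Personality& p) {
        AnimationParams a;
        a.speedScale     = 0.8f + 0.4f * p.extraversion;          // 0.8x .. 1.2x
        a.amplitudeScale = 0.7f + 0.6f * p.extraversion;          // 0.7x .. 1.3x
        a.postureLeanDeg = -5.0f + 10.0f * p.emotionalStability;  // -5 .. +5 deg
        return a;
    }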

Previous Work

The way a person behaves in their everyday life is influenced by their personality. This influence can also be seen in how a person moves [1]. We propose to solve the heterogeneity issue of the virtual audience by incorporating a system to dynamically alter animations depending on personality parameters assigned to the individual agents.

In psychology, the most widely employed model of personality is the Big Five personality traits model. Previous research has already linked particular behaviors and movement patterns to different traits of the Big Five model [1, 3, 5]. However, most of the findings seem to clearly correspond to only two of the five dimensions: extraversion and emotional stability. Smith and Neff have also proposed simplifying the Big Five model into two categories for movement: plasticity, combining extraversion and openness, and stability, combining emotional stability, conscientiousness, and agreeableness [5]. We will therefore focus on creating variations based on the extraversion and emotional stability traits.
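
To make this simplification concrete, the two composite scores could be computed as equal-weight averages of the contributing traits, as in the C++ sketch below. The averaging itself is our assumption for illustration; [5] does not prescribe a specific formula:

    // Composite movement scores following the plasticity/stability split [5].
    // The equal-weight averaging is our own assumption for illustration.
    struct BigFive {
        // All traits normalized to [0, 1].
        float openness, conscientiousness, extraversion,
              agreeableness, emotionalStability;
    };

    float Plasticity(const BigFive& t) {
        return 0.5f * (t.extraversion + t.openness);
    }

    float Stability(const BigFive& t) {
        return (t.emotionalStability + t.conscientiousness + t.agreeableness) / 3.0f;
    }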

Planned Methodology and Concepts

We will first summarize the findings from previous studies on how personality affects movements. Using these findings, we will examine the existing animations of the virtual audience system and categorize them accordingly.

The personality system we aim to create will also have to work in conjunction with the attitude model already implemented in the virtual audience system. Therefore, we need to alter the existing animations to reflect the personality the agents should have, while making sure their attitudes towards the speaker remain recognizable.

One possible implementation could use Unreal Engine’s Control Rig feature to alter the animations at run-time. Another possibility would be to take each existing animation, create two variations representing the extremes of one of the Big Five traits, and then blend between these extremes using a Blendspace. As the low end of the emotional stability trait is associated with self-adaptors [2], it would also be possible, for example, to create animations of the characters performing small scratches or touching their face that could be blended into existing animations.
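
Stripped of the engine specifics, the Blendspace approach amounts to interpolating between two extreme variants of an animation per trait. The simplified C++ sketch below illustrates the idea; the flat Pose representation and the function name are our own simplifications, and the real system would use Unreal Engine’s animation pipeline instead:

    #include <cstddef>
    #include <vector>

    // A pose reduced to one angle per joint; both extremes must match in size.
    using Pose = std::vector<float>;

    // Linear blend between the two extreme variants of an animation frame.
    // trait is the agent's normalized trait value: 0 = low extreme (e.g., the
    // introverted variant), 1 = high extreme (e.g., the extraverted variant).
    Pose BlendPose(const Pose& lowExtreme, const Pose& highExtreme, float trait) {
        Pose out(lowExtreme.size());
        for (std::size_t i = 0; i < lowExtreme.size(); ++i) {
            out[i] = (1.0f - trait) * lowExtreme[i] + trait * highExtreme[i];
        }
        return out;
    }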

Previous studies have successfully employed a Big Five personality-based questionnaire to verify their models of personality-driven animations [2, 3]. Based on the process used by these studies, the evaluation will be conducted as an online survey. A total of five videos of isolated agents will be shown to participants, each showing a different, predefined configuration of the personality parameters, for example the high and low extremes of the extraversion and emotional stability traits plus a neutral configuration.

The agents will behave the same way they would during the audience simulation, showing different animations. Participants will then fill out a Big Five personality test for each character and describe what kind of personality they think the character has. After viewing the videos of the individual agents, participants will be shown a video of the audience as a whole. Instead of filling out a survey focusing on one particular agent, participants will be asked whether they felt the agents had their own way of moving and whether they felt the agents had different personalities.

This could be done in stages. At first, an audience could be shown in which all agents display the same behavior. The number of agents showing this shared behavior could then be lowered step by step until all agents show different behaviors. This will allow us to see whether the introduction of personality has any impact on how the audience is perceived. In addition, it would also show whether it is necessary to give each agent a personality or whether it suffices to give only a certain percentage of agents one.
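
As a small illustration of how such stages could be generated, the C++ sketch below computes, for each stage, how many agents receive an individual personality; the audience size and stage count are arbitrary example values:

    #include <vector>

    // For each stage, the number of agents that receive an individual
    // personality, rising evenly from none to the whole audience.
    // Assumes stages >= 2.
    std::vector<int> StagedPersonalityCounts(int audienceSize, int stages) {
        std::vector<int> counts;
        for (int s = 0; s < stages; ++s) {
            counts.push_back(audienceSize * s / (stages - 1));
        }
        return counts;  // e.g., {0, 5, 10, 15, 20} for 20 agents and 5 stages
    }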

Once the personalities have been verified, their impact on the social presence of the virtual agents will be measured in an additional study. This could be done using the Co-presence and Social Presence Scale developed by Poeschl and Doering [4]. This scale was specifically developed for measuring the social presence and co-presence of virtual audiences for speech anxiety training, and thus fits our scenario well.

The study will use a between-subjects design. One group will be presented with a virtual audience that shows no personality-driven animations, and the second group will be shown an audience using our personality-enhanced system. Participants will be asked to give a short speech about a topic they are familiar with in front of the virtual audience. Afterwards, participants will fill out a questionnaire consisting of the items of the Co-presence and Social Presence Scale.

One of the most common strategies for treating anxieties is exposure therapy. As part of exposure therapy is to induce a form of emotional stress in the participant, we think that measuring participants’ emotional stress response could be used, in addition to the presence scale, to measure the effectiveness of a personality-enhanced system.

Time Schedule

Planned time schedule.

References

[1] Jensen, M. (2016). Personality traits and nonverbal communication patterns. International Journal of Social Science Studies, 4. https://doi.org/10.11114/ijsss.v4i5.1451
[2] Neff, M., Toothman, N., Bowmani, R., Tree, J. E. F., & Walker, M. A. (2011). Don’t scratch! Self-adaptors reflect emotional stability. Lecture Notes in Computer Science, 6895 LNAI, 398–411. https://doi.org/10.1007/978-3-642-23974-8_43
[3] Neff, M., Wang, Y., Abbott, R., & Walker, M. (2010). Evaluating the effect of gesture and language on personality perception in conversational agents.
[4] Poeschl, S., & Doering, N. (2015). Measuring co-presence and social presence in virtual environments – psychometric construction of a German scale for a fear of public speaking scenario.
[5] Smith, H. J., & Neff, M. (2017). Understanding the impact of animated gesture performance on personality perceptions. ACM Transactions on Graphics, 36. https://doi.org/10.1145/3072959.3073697


Contact Persons at the University of Würzburg

Dr. Jean-Luc Lugrin (Primary Contact Person)
Human-Computer Interaction, Universität Würzburg
jean-luc.lugrin@uni-wuerzburg.de
