
Game User Experience Evaluation: Evaluating User Experience Factors using Experiments: Expressive Artificial Faces Embedded in Contexts


Publisher: Springer International Publishing
Copyright: © Springer International Publishing Switzerland 2015
ISBN: 978-3-319-15984-3
Pages: 113–131
DOI: 10.1007/978-3-319-15985-0_6

Abstract

There is an ongoing debate about which factors contribute to an overall positive user experience (UX) while playing a game. This chapter introduces an experimental setting to measure the UX elicited by the facial expressions of embodied conversational agents (ECAs). The setup allows measuring the impact of ECAs in three contextual settings, called “still,” “animated,” and “interaction.” Within the experiment, artificially generated facial expressions are combined with emotion-eliciting situations and presented on different platforms. Stimuli (facial expressions/emotion-eliciting situations) are assembled in either consonant (for example, facial expression: “joy,” emotion-eliciting situation: “joy”) or dissonant (for example, facial expression: “joy,” emotion-eliciting situation: “anger”) constellations. The contextual setting called “interaction” is derived from the video games domain, providing an interactive experience of a given emotional situation. The aim of the study is to establish a comparative experimental framework for analyzing subjects’ UX of emotional stimuli across different context dimensions. The framework combines models from emotion theory with approaches from human–computer interaction to close a gap at the intersection of affective computing and research on facial expressions. Results showed that the interaction setting is rated as providing the best UX, regardless of whether consonant or dissonant contextual descriptions are shown; the “still” setting receives a higher UX rating than the “animated” setting.
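To illustrate the stimulus design described above, the sketch below assembles consonant and dissonant expression/situation pairs across the three contextual settings. The emotion labels and the helper name `build_constellations` are illustrative assumptions, not taken from the chapter, and the chapter's actual stimulus counts and material are not reproduced here.

```python
from itertools import product

# Hypothetical emotion labels; the chapter's actual stimulus set may differ.
EMOTIONS = ["joy", "anger", "fear", "sadness", "surprise", "disgust"]
CONTEXTS = ["still", "animated", "interaction"]

def build_constellations(emotions=EMOTIONS, contexts=CONTEXTS):
    """Pair every facial expression with every emotion-eliciting situation
    in every contextual setting, labeling each pair as consonant
    (expression matches situation) or dissonant (it does not)."""
    constellations = []
    for context, expression, situation in product(contexts, emotions, emotions):
        constellations.append({
            "context": context,
            "facial_expression": expression,
            "eliciting_situation": situation,
            "constellation": "consonant" if expression == situation else "dissonant",
        })
    return constellations

if __name__ == "__main__":
    stimuli = build_constellations()
    print(len(stimuli), "stimulus constellations")  # 3 contexts * 6 * 6 pairs = 108
    print(stimuli[0])  # a consonant "joy"/"joy" pair in the "still" setting
```

Such a full factorial of expressions, situations, and contexts is one straightforward way to obtain balanced consonant and dissonant constellations for comparison across the three settings.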

Published: Jun 5, 2015

Keywords: Facial Expression; Basic Emotion; Emotional Facial Expression; Conversational Agent; Facial Action Coding System
