Facial and Bodily Expressions for Control and Adaptation of Games (ECAG’11)

Santa Barbara, California, March 24 or 25 (TBD), 2011

http://hmi.ewi.utwente.nl/ecag11

Workshop organized at the Ninth IEEE International Conference on Automatic Face and Gesture Recognition (FG 2011)

Workshop Description

--------------------

Expressivity in the human face and body can serve to control a system or to adapt the interaction with it. Examples include the Xbox Kinect, which uses body movements to control game characters; gesture-based interaction with a robot in a home environment; and adapting the teaching strategy of a tutoring application based on detected frustration or boredom. In these examples, observations of the face and body are used in different forms, depending on whether the user has the initiative to control the interaction or whether the application takes the initiative to adapt itself to the user. Hence, we look at:

Control: The user consciously produces facial expressions, head movements or body gestures to control a game. This includes commands that allow navigation in the game environment, movements of game characters, or changes in their appearance (e.g. showing similar facial expressions on the character’s face, or translating body gestures into emotion-related or emotion-guided actions).

Adaptation: The gamer’s spontaneous facial expressions and body poses are interpreted and used to adapt the game to the gamer’s inferred affective state. This adaptation can affect the appearance of the game environment, the interaction modalities, the experience and engagement, the narrative, and the strategy followed by the game or the game actors.

We are soliciting papers that discuss research in this area, with a focus on applications. We consider the domains of entertainment, robot control, and (serious) gaming and simulation. In addition to video-based observation, we also consider other means of input, including multi-modal approaches. Technical papers, survey papers, and empirical papers are all eligible. Authors are invited to submit papers following the page limits and formatting guidelines of the main conference (see www.fg2011.org). Each paper will be refereed by three reviewers. Submissions will be handled through a conference management system; more information will be available shortly.

All FG workshop papers will be archived in IEEE Xplore and included in the DVD conference proceedings.

Important Dates

---------------

Deadline for submission: December 14, 2010

Notification of acceptance: January 12, 2011

Final versions due: January 19, 2011

Conference: March 21-23, 2011

Workshops: March 24 or 25 (TBD), 2011

Workshop Organizers

-------------------

Anton Nijholt, University of Twente, the Netherlands

Ronald Poppe, University of Twente, the Netherlands
