BodySpeech: A configurable facial and gesture animation system for speaking avatars
Publication date
2013-07
Abstract
Speaking avatars are present in many Human-Computer Interaction (HCI) applications. Their importance lies in communicative goals, which entail interaction with other avatars in virtual worlds, or in marketing, where they have become useful in customer push strategies. Generating automatic and plausible animations from speech cues has become a challenge. We present BodySpeech, an automatic system that generates gesture and facial animations driven by speech. Body gestures are aligned with pitch accents and selected based on the strength relation between speech and body gestures. Concurrently, facial animation is generated for lip sync, adding emphatic hints according to intonation strength. Furthermore, we have implemented a tool for animators. This tool enables modifying the detection of pitch accents and the influence of intonation strength on output animations, allowing animators to define the activation of gestural performances.
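The abstract describes aligning body gestures with detected pitch accents and gating them by intonation strength, with an animator-tunable activation threshold. A minimal sketch of that idea follows; it is not the authors' implementation, and all names, thresholds, and the gesture inventory are hypothetical.

```python
# Hypothetical sketch: map detected pitch accents to body gestures by
# intonation strength. The gesture inventory and thresholds are invented
# for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PitchAccent:
    time: float      # accent time in the speech signal, seconds
    strength: float  # normalized intonation strength, 0.0..1.0

# Hypothetical inventory, ordered by minimum strength required.
GESTURES = [
    (0.0, "subtle_nod"),
    (0.4, "hand_beat"),
    (0.8, "emphatic_arm_gesture"),
]

def select_gesture(accent: PitchAccent, threshold: float = 0.2) -> Optional[str]:
    """Pick the strongest gesture the accent qualifies for.

    `threshold` mimics the animator-tunable activation control the
    abstract mentions: accents below it trigger no gesture at all.
    """
    if accent.strength < threshold:
        return None
    chosen = GESTURES[0][1]
    for min_strength, name in GESTURES:
        if accent.strength >= min_strength:
            chosen = name
    return chosen

# Example: build a gesture timeline from three detected accents.
accents = [PitchAccent(0.5, 0.1), PitchAccent(1.2, 0.5), PitchAccent(2.0, 0.9)]
timeline = [(a.time, select_gesture(a)) for a in accents]
# -> [(0.5, None), (1.2, 'hand_beat'), (2.0, 'emphatic_arm_gesture')]
```

Raising or lowering `threshold` plays the role of the animator tool: it changes which accents activate gestural performances without touching the speech analysis itself.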
Document Type
Conference object
Language
English
Subject (CDU)
62 - Engineering. Technology in general
Keywords
Human-robot interaction
Computer animation
Pages
7 p.
Publisher
International Conference on Computer Graphics and Virtual Reality, Las Vegas, July 22-25, 2013
Rights
© The author. All rights reserved.