BodySpeech: A configurable facial and gesture animation system for speaking avatars
Author(s)
Fernández Baena, Adso
Antonijoan Tresens, Marc
Montaño Aparicio, Raúl
Fusté Lleixà, Anna
Amores Fernandez, Judith
Other authors
Universitat Ramon Llull. La Salle
Publication date
2013-07
Abstract
Speaking avatars are present in many Human-Computer Interaction (HCI) applications. Their importance lies in communicative goals that entail interaction with other avatars in virtual worlds, or in marketing, where they have become useful in customer push strategies. Generating automatic and plausible animations from speech cues has become a challenge. We present BodySpeech, an automatic system to generate gesture and facial animations driven by speech. Body gestures are aligned with pitch accents and selected based on the strength relation between speech and body gestures. Concurrently, facial animation is generated for lip sync, adding emphatic hints according to intonation strength. Furthermore, we have implemented a tool for animators. This tool makes it possible to modify the detection of pitch accents and the influence of intonation strength on the output animations, allowing animators to define the activation of gestural performances.
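The abstract describes gestures being aligned to pitch accents and activated according to intonation strength, with an animator-tunable threshold. The following Python sketch is a hypothetical illustration of that idea only; it is not the authors' implementation, and all names (PitchAccent, Gesture, select_gestures, the activation parameter) are assumptions made for the example.

```python
# Hypothetical sketch: align gestures to detected pitch accents and select
# them by intonation strength, with an animator-controlled activation gate.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class PitchAccent:
    time_s: float    # time of the accent in the speech track (seconds)
    strength: float  # normalized intonation strength, 0.0 .. 1.0


@dataclass
class Gesture:
    name: str
    min_strength: float  # minimum accent strength that triggers this gesture


def select_gestures(accents: List[PitchAccent],
                    gestures: List[Gesture],
                    activation: float = 0.0) -> List[Tuple[float, str]]:
    """For each pitch accent above the activation threshold, pick the most
    emphatic gesture whose own threshold the accent still satisfies."""
    timeline = []
    for accent in accents:
        if accent.strength < activation:
            continue  # animator-defined gate on gestural performance
        candidates = [g for g in gestures if g.min_strength <= accent.strength]
        if candidates:
            chosen = max(candidates, key=lambda g: g.min_strength)
            timeline.append((accent.time_s, chosen.name))
    return timeline


if __name__ == "__main__":
    accents = [PitchAccent(0.8, 0.3), PitchAccent(2.1, 0.9)]
    gestures = [Gesture("small_beat", 0.2), Gesture("emphatic_beat", 0.7)]
    print(select_gestures(accents, gestures, activation=0.25))
    # [(0.8, 'small_beat'), (2.1, 'emphatic_beat')]
```

Raising the activation threshold suppresses gestures on weak accents, which mirrors the abstract's point that animators can control when gestural performances are triggered.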
Document type
Conference object
Language
English
Subjects (UDC)
62 - Engineering. Technology
Keywords
Human-robot interaction
Computer animation
Pages
7 p.
Published by
International Conference on Computer Graphics and Virtual Reality, Las Vegas, July 22-25, 2013
Rights
© The author(s). All rights reserved.