Virtual Characters Get Facial Expressions

One of the most difficult problems in animation is simulating human facial expressions on virtual characters. These characters usually mimic human behavior through programmed commands or scripts, which tends to give them a rather robotic feel and loses the spontaneous quality of human expression in live communication.

Researchers at the Autonomous University of the State of Mexico (UAEM) were able to generate expressions and emotions based on real people. They placed sensors on the 43 muscles involved in facial movement, recorded the resulting activity with 3D cameras, and translated it into numerical data that could be used in programming.
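The article does not describe the researchers' data format, so the muscle names, value ranges, and mapping below are purely illustrative assumptions. A common way such capture data drives a virtual face is to convert per-muscle activation readings into blendshape weights, which a minimal sketch might do like this:

```python
# Hypothetical sketch: normalized activation readings (0.0-1.0) for a few
# facial muscles, as they might arrive from motion-capture sensors.
sensor_readings = {
    "zygomaticus_major": 0.8,   # pulls the mouth corners up (smiling)
    "frontalis": 0.3,           # raises the eyebrows
    "orbicularis_oculi": 0.6,   # narrows the eyes
}

# Assumed mapping from each muscle to a character blendshape and a gain factor.
muscle_to_blendshape = {
    "zygomaticus_major": ("mouth_smile", 1.0),
    "frontalis": ("brow_raise", 0.9),
    "orbicularis_oculi": ("eye_squint", 0.7),
}

def to_blendshape_weights(readings):
    """Convert raw muscle activations into clamped blendshape weights."""
    weights = {}
    for muscle, activation in readings.items():
        shape, gain = muscle_to_blendshape[muscle]
        # Clamp to the valid blendshape range [0.0, 1.0].
        weights[shape] = max(0.0, min(1.0, activation * gain))
    return weights

print(to_blendshape_weights(sensor_readings))
```

In a real pipeline the weight dictionary would be fed to the character rig each frame, so the virtual face tracks the recorded performance rather than a scripted animation.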

A large number of emotions, intentions, attitudes, and moods that vary with social context were recorded for the study, and the collected data was then used to generate facial expressions in virtual characters. Unlike data gathered for video games and consoles, this data is intended not for entertainment software but for educational purposes.

The researchers are targeting educational, scientific, and civic applications, a field they call the "Serious Game." They want the study's findings to be put to genuinely beneficial use for humanity rather than mere entertainment.

