This tool is intended for those who need virtual characters with valid and reliable facial emotion expression. Rather than playing pre-made animations, the system uses direct bone control to express and blend dynamic expressions. The expressions have been evaluated in (Broekens, Qu, & Brinkman, 2012). The tool contains the following items:
1. The standalone experiment executable, for testing the expressions that can be generated with the software and models. Use the number keys 1-6 and the keys q-y to control expressions.
2. An example model (in FaceGen, 3DS, and Vizard formats) with executable code instrumented to express emotions on the face, following the method described in (Broekens et al., 2012). Again, use the number keys 1-6 and the keys q-y to control expressions.
3. The technical report (please cite it when you use the tool, its method, its data, or parts thereof).
4. The experiment Python code (Vizard development kit):
a. A class to control facial expressions.
b. A class to manage an emotional state.
c. The experiment code used to test the expressions.
All code requires Vizard 3D (http://www.worldviz.com/products/vizard).
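For orientation, here is a minimal sketch of loading a FaceGen head in Vizard. The file name face.vzf is a placeholder, not a file shipped with this package; viz.addFace is Vizard's standard loader for FaceGen models.

    # Minimal Vizard scene: load a FaceGen head exported for Vizard.
    # 'face.vzf' is a placeholder; substitute your own model file.
    import viz

    viz.go()                         # start the Vizard renderer
    face = viz.addFace('face.vzf')   # load a FaceGen face model
    face.setPosition(0, 1.5, 2)      # place the head in front of the camera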
Usage of the emotional-state and expression code is straightforward, and setting up an avatar with emotions takes only a few lines of code. See the class AffectManager in emotion.py, and the example code in the Character class's _init method in expressionexperiment2.py. There you will find how to control the face with keys, as well as how to set up a character with emotions; a sketch of the idea follows below.
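As a rough illustration (the AffectManager constructor signature and the setEmotion method name below are assumptions for the sketch; consult emotion.py and expressionexperiment2.py for the actual API), wiring a face to key-controlled emotions could look like this. vizact.onkeydown is standard Vizard and binds a key to a callback with arguments.

    # Sketch only: assumed AffectManager API, placeholder model file.
    import viz
    import vizact
    from emotion import AffectManager   # class shipped with this package

    viz.go()
    face = viz.addFace('face.vzf')      # placeholder FaceGen model
    affect = AffectManager(face)        # hypothetical constructor signature

    # Bind keys to expressions, mirroring the 1-6 / q-y controls above.
    # setEmotion(name, intensity) is an assumed method name.
    vizact.onkeydown('1', affect.setEmotion, 'joy', 1.0)
    vizact.onkeydown('q', affect.setEmotion, 'anger', 1.0)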
Please note that a new virtual character model needs to be "rigged", i.e., FACS-based muscle attachment must be done manually for every new facial morphology. See the example directory for a FaceGen, 3DS, and Vizard model example. An overview of the process of creating a new face with a different morphology can be found in the file Facial emotion expression overview.pptx in the example directory, and the paper gives more detail. TIP: simply changing the texture of the head will already create a different look.
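To give a feel for what the manual rigging amounts to (all names and weights below are purely illustrative, not the package's actual data format), the work is essentially building a table from FACS action units to the bones of the new morphology and their displacement weights:

    # Illustrative only: a FACS action-unit-to-bone mapping for one face.
    # Bone names and weights are hypothetical; derive real values per model.
    AU_TO_BONES = {
        'AU1_inner_brow_raiser':  [('brow_inner_L', 0.8), ('brow_inner_R', 0.8)],
        'AU4_brow_lowerer':       [('brow_inner_L', -0.6), ('brow_inner_R', -0.6)],
        'AU12_lip_corner_puller': [('lip_corner_L', 1.0), ('lip_corner_R', 1.0)],
    }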
If you have any questions on how to rig the 3D model with FACS muscle attachments prior to using it for expression, please contact joost.broekens@gmail.com.
Download the full package here
License of use
This system is distributed under the Creative Commons License CC-BY-NC-SA, unless otherwise agreed in writing.
In short, this means that you may use it in your own research or educational project, and that you may alter it, but you may not state that your alterations are endorsed by me, and you must always credit the inventors. We prefer that you credit us by citing our most recent work:
Broekens, J., Qu, C., & Brinkman, W.-P. (2012). Dynamic Facial Expression of Emotion Made Easy. Technical report, Interactive Intelligence, Delft University of Technology.
For commercial use, please contact me at the email address above.