This component allows applying particular facial expressions to a character.
This component uses the [Facial Action Coding System](https://imotions.com/blog/facial-action-coding-system/) (FACS) as a notation system for possible facial movements, called action units (AUs). Each AU has a value that specifies to what extent the motion should be performed, akin to blendshapes/morph targets.
The facial expressions, or poses, are stored in `FACSExpressionsLibrary`, which maps an emotion of the enum type `Emotions` to an array of FACS values (`FACSValue`).
`FACSValue` itself is a structure which contains:
- an integer value denoting the AU number
- the value associated with the AU
- a `FACSSidedness` `Side`, which represents the side of the face the AU applies to (left, right, or both)
There are 8 basic emotions stored in `FACSExpressionsLibrary`:
- Neutral
- Happiness
- Sadness
- Surprise
- Fear
- Anger
- Disgust
- Contempt
Additionally, there is a `Custom` emotion, which allows manually overriding and adding values for specific AUs.
To pick an emotion, set `SelectedEmotion` to the appropriate emotion. This can be done in both Blueprint and C++. Additionally, there are the functions `SetFACSActionUnit`, which uses the `Custom` emotion to change the value of a given AU, and `VHSetFACSValues`, which applies an array of such AU values at once.
Important: this component requires a pose asset mapping (`PoseAsset Mapping`) for the corresponding character model type, `CC3` or `MetaHuman`. For these two, the mappings `FacialExpressionsCC3PoseAsset` and `FacialExpressionsMetaPoseAsset` are provided.
# CC4 Characters and MetaHumans
Since CC4, Reallusion has updated its UE5 export pipeline to better align with MetaHuman's animation design philosophy, especially its curve-driven facial expressions.
Facial expressions are driven by a combination of morph targets and bone transforms. Morph targets are pure vertex-offset snapshots, defined by the artist in a DCC tool, and are used for controls such as raising and lowering the eyebrows, moving the mouth corners, or details like wrinkles. Bone transforms are usually used for jaw and eyeball movements.
MetaHuman takes a cinematic and scalable approach to facial control. It defines a set of high-level, standardized controls (e.g., mouth stretch), then maps these controls to animation curves, which in turn drive a large library of granular, pre-sculpted morph targets. This curve-to-expression mapping is evaluated within the Control Rig. In this way, MetaHuman gives artists intuitive, high-level control that automatically produces detailed facial expressions from its library of morph targets, without requiring the user to sculpt them individually.