Getting Started
Before diving into the details of the implemented features, here is the step-by-step process for setting up face animation. In this example, LiveLink data is applied to a MetaHuman; however, these steps transfer well to the other combinations that the CharacterPlugin supports.
- Set up MetaHuman with CharacterPlugin
- First, the appropriate component has to be added to the MetaHuman Blueprint, in this case `VHLiveLinkAnimation`:
- Then the correct mapping between the recorded data and the blendshapes of the model has to be selected, here `/CharacterPlugin/Characters/MetaHuman/LiveLinkFaceToMeta` (all paths are relative to the `Content` directory). For more details on which mapping pose assets to use, see below.
- Now the character is ready to receive face animation data. In our case, this is provided as a .csv file. For a first test, the path can simply be entered into the `Animation Filename` field in the Details tab. This property can also be set from C++ code in order to play different animations at runtime (see the sketch after these steps).
- Finally, the animation has to be started by calling the `Play()` function. This can again be done in C++; here we call it by adding the `Play()` node to the Event Graph. The animation should now play when `Play()` is called at runtime.
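For reference, here is a minimal C++ sketch of the same setup. The component name `VHLiveLinkAnimation` and `Play()` come from the steps above; the U-prefixed class name, the header include, the hypothetical owning actor `AMyMetaHuman`, and `AnimationName` as the property backing the "Animation Filename" field (listed in the Feature Set below) are assumptions and may differ in your version of the plugin.

```cpp
// Minimal sketch, assuming the component class is exposed as UVHLiveLinkAnimation
// and that the "Animation Filename" Details field is backed by AnimationName.
#include "VHLiveLinkAnimation.h" // assumed header name

void AMyMetaHuman::StartFaceAnimation()
{
    // Look up the face animation component added to the MetaHuman Blueprint.
    UVHLiveLinkAnimation* FaceAnim = FindComponentByClass<UVHLiveLinkAnimation>();
    if (FaceAnim == nullptr)
    {
        return;
    }

    // Assumed property behind the "Animation Filename" field in the Details tab.
    FaceAnim->AnimationName = TEXT("FaceAnimations/MyRecording.csv");

    // Start playback, equivalent to calling the Play() node in the Event Graph.
    FaceAnim->Play();
}
```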
At this point the first setup is complete, and you can delve into the details to customize the component to your needs. The next section describes the feature set in more detail.
Feature Set
We implemented different components to animate the face of virtual humans, all derived from `VHFaceAnimation` (e.g. `VHLiveLinkAnimation`, `VHOculusLipSync`, ...), which contain:
- Flags (`bUseHeadRotation` and `bUseEyeRotations`) that specify whether only blendshapes and the jaw bone, or also head and eye movements, should be used (not provided by every component, e.g., not by OculusLipSync)
- A `Calibrate()` method to always use a specific frame from a specific animation as the neutral face, or a `bResetFaceCalibration` flag to use the frame at time `TimeForCalibration` of the current animation for calibration, i.e., it is subtracted from every frame. This can be used when the animation sequence starts with a neutral face that is not detected as entirely neutral due to differences in the tracked faces.
- A `LoadAnimationFile()` method that loads an animation with the given implementation and can also be used to preload animations, since they are cached and reused. Alternatively, you can specify an animation with `AnimationName`, which, if set, is loaded on `BeginPlay`.
- General `PlayFromTime(StartTime [in seconds])`, `Play()`, `Stop()` and `IsPlaying()` methods
- A `SaveAsAnimSequence(FString AnimName)` method that converts the animation data previously loaded with `LoadAnimationFile()` to an Unreal Animation Asset (.uasset) with the name `AnimName`
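The following is a hedged sketch of how these members might be combined from C++. Only the member names are taken from this page; the U-prefixed class name `UVHFaceAnimation`, the exact parameter types, and the surrounding free function are assumptions.

```cpp
// Sketch of the VHFaceAnimation interface described above: preloading,
// calibration, playback control and export. Parameter types are assumed.

void DemoFaceAnimationApi(UVHFaceAnimation* FaceAnim)
{
    if (FaceAnim == nullptr)
    {
        return;
    }

    // Optionally drive head and eye movement in addition to blendshapes and
    // the jaw bone (not available in every derived component).
    FaceAnim->bUseHeadRotation = true;
    FaceAnim->bUseEyeRotations = true;

    // Preload an animation file; loaded animations are cached and reused.
    FaceAnim->LoadAnimationFile(TEXT("FaceAnimations/Greeting.csv"));

    // Treat the frame at TimeForCalibration as the neutral face; it is then
    // subtracted from every frame of the animation.
    FaceAnim->bResetFaceCalibration = true;
    FaceAnim->TimeForCalibration = 0.0f;

    // Playback control.
    FaceAnim->PlayFromTime(1.5f);      // start 1.5 seconds into the animation
    if (FaceAnim->IsPlaying())
    {
        FaceAnim->Stop();
    }

    // Convert the previously loaded data into an Unreal Animation Asset.
    FaceAnim->SaveAsAnimSequence(TEXT("Greeting_FaceAnim"));
}
```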
The face animation uses a pose asset that maps the different poses provided by, e.g., LiveLink to the blendshapes of the model. If you need to create a new pose asset, the names of the poses can be found in the plugin's `Content/Characters/FacePoseAssetsResources`. Create an animation with curve values for all needed blendshapes, where each frame of the animation is one pose (frame 0 being the first pose), then use `Create Asset -> Pose Asset` and copy-paste the pose names in there.
A demo map can be found in Unreal Character Test at `Maps/FaceAnimation`. It implements key events for: 1 (VHLiveLinkFaceAnimation), 2 (VHOpenFaceAnimation), 3 (VHOculusLipSync) and 4 (saving the LiveLinkFace animation into an Unreal Animation Sequence asset). See the `FaceAnimation/FaceAnimationInput` blueprint for details.
Additional Non-Asset Directories To Copy
Oculus Lip Sync (VHOculusLipSync)
Use Oculus Lip Sync to animate the face based on an audio file only.
First, the audio file needs to be preprocessed, e.g., with the OculusLipSyncWAVParser. This generates a visemes.txt file, which is then used as the animation file.
For Character Creator 3 models, use the plugin's `Content/Characters/Henry/OculusLipSyncToCC3` pose asset (which was created from the `OculusLipSyncToCC3Anim` animation sequence).
Live Link Face, based on iOS ARKit (VHLiveLinkFaceAnimation)
Track the face using LiveLinkFace (for more information see https://docs.unrealengine.com/en-US/AnimatingObjects/SkeletalMeshAnimation/FacialRecordingiPhone/index.html). However, we do not connect live to the iPhone but just use the saved .csv file. A script for cutting those files, as well as best practices for recording, can be found here: LiveLinkFaceCuttingScript
Copy that .csv file somewhere into the Content folder and load it.
For Character Creator 3 models, use the plugin's `Content/Characters/Henry/LiveLinkFaceToCC3` pose asset (which was created from the `LiveLinkFaceToCC3Anim` animation sequence).
Open Face (VHOpenFaceAnimation)
It is also possible to track a face in a recorded video using OpenFace directly.
For that:
- Get OpenFace https://github.com/TadasBaltrusaitis/OpenFace/wiki/Windows-Installation
- Download Models: https://github.com/TadasBaltrusaitis/OpenFace/wiki/Model-download
Then parse a video using, e.g., OpenFaceOffline.exe (for now, only RecordAUs and RecordPose are supported).
Copy the resulting .csv file over and load it.
Unreal 5.5 Audio to Face Tutorial
A video tutorial for setting up Audio to Face in Unreal 5.5.
1. Install the MetaHuman plugin from fab.com.
2. Enable the plugin in the Unreal Editor.
3. Create a MetaHuman Performance asset and set:
   - InputType to Audio
   - Audio asset to the audio you want to convert
   - Control Rig to Face_ControlBoard_CtrlRig under the MetaHuman template folder
   - Visualization Mesh to the face skeletal mesh you want to preview
4. In the export settings, set TargetSkeleton to the Face MetaHuman skeleton under the CharacterPlugin content folder.
5. Create a Montage asset from the exported animation sequence.
6. Play the montage by calling PlayFacialAnimationMontage in VirtualHuman (a hedged C++ sketch follows these steps).
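The final step can also be triggered from C++. The sketch below assumes `PlayFacialAnimationMontage` takes the created montage as a single `UAnimMontage*` parameter; the actual signature is not documented here.

```cpp
// Hedged sketch: play the exported Audio-to-Face montage on a virtual human.
// AVirtualHuman and PlayFacialAnimationMontage are named on this page; the
// single UAnimMontage* parameter is an assumption.
#include "Animation/AnimMontage.h"

void PlayAudioToFaceMontage(AVirtualHuman* VirtualHuman, UAnimMontage* FaceMontage)
{
    if (VirtualHuman != nullptr && FaceMontage != nullptr)
    {
        VirtualHuman->PlayFacialAnimationMontage(FaceMontage);
    }
}
```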
Unreal 5.5 Audio to Face Feature (Workaround)

To use Audio to Face Animations together with Character Plugin features such as gazing, follow these steps:
- Duplicate the Meta_Anim_BP (not the one for the face)
- Add a `Layered blend per bone` node and connect it as shown in the image
- Set the `Blend Mode` to `Branch Filter` and the `Bone Name` to `FACIAL_C_NECK1Root` (see image on the right)
- Use your animation as a `CustomIdleAnimation`; note that `CustomIdleAnimations` is an array from which an animation is randomly chosen, so for this use case make sure to only have one animation in the array at a time
- It might be helpful to call `AVirtualHuman::ForcePlayNewIdleAnimation` (also callable from Blueprints) when you change the `CustomIdleAnimations` (see the sketch below)
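As a minimal sketch, assuming `CustomIdleAnimations` is a `TArray` of animation assets on `AVirtualHuman` and that `ForcePlayNewIdleAnimation` takes no arguments (neither signature is documented here), the last two points could look like this in C++:

```cpp
// Hedged sketch: use the Audio-to-Face animation as the only custom idle
// animation and force the virtual human to switch to it immediately.
// The element type of CustomIdleAnimations and the argument list of
// ForcePlayNewIdleAnimation are assumptions.

void UseFaceAnimationAsIdle(AVirtualHuman* VirtualHuman, UAnimSequence* FaceAnimation)
{
    if (VirtualHuman == nullptr || FaceAnimation == nullptr)
    {
        return;
    }

    // Keep only one animation in the array at a time, since an idle animation
    // is chosen from it at random.
    VirtualHuman->CustomIdleAnimations.Empty();
    VirtualHuman->CustomIdleAnimations.Add(FaceAnimation);

    // Apply the change right away instead of waiting for the next idle switch.
    VirtualHuman->ForcePlayNewIdleAnimation();
}
```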