# Getting Started
Before diving into the details of the implemented features, here is the step-by-step process for setting up face animation. In this example, LiveLink data will be applied to a MetaHuman; however, these steps transfer well to the other combinations that the CharacterPlugin supports.
A **demo map** can be found in [Unreal Character Test](https://devhub.vr.rwth-aachen.de/VR-Group/unreal-character-test) at Maps/FaceAnimation. It implements key events for: 1 (VHLiveLinkFaceAnimation), 2 (VHOpenFaceAnimation), 3 (VHOculusLipSync), and 4 (saving the LiveLinkFace animation into an Unreal Animation Sequence asset). See the `FaceAnimation/FaceAnimationInput` blueprint for details.
0. [Set up MetaHuman with CharacterPlugin](/Import-Data/MetaHuman-Model-Data)
1. First, the appropriate component has to be added to the MetaHuman Blueprint, in this case `VHLiveLinkAnimation`.
2. Then the correct mapping between the recorded data and the blendshapes of the model has to be selected, here `/CharacterPlugin/Characters/MetaHuman/LiveLinkFaceToMeta` (all paths are relative to the `Content` directory). For more details on which mapping pose assets to use, see below.
3. Now the character is ready to receive face animation data. In our case, this is provided as a .csv file. For a first test, the path can simply be entered into the `Animation Filename` field in the "Details" tab. Of course, this property can also be set from C++ code to play different animations at runtime.
4. Finally, the animation has to be started by calling the `Play()` function, either from C++ (see the sketch after these steps) or, as here, by adding the `Play()` node to the Event Graph. The animation should then play when `Play()` is called at runtime.
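The same call can be made from C++ instead of the Event Graph. The following is a minimal, hedged sketch: `UVHLiveLinkAnimation` (with the usual `U` prefix) and the exact signatures of `Play()` and `LoadAnimationFile()` are assumptions based on this page, not taken from the plugin headers.

```cpp
// Hedged C++ equivalent of step 4: find the face animation component on the
// MetaHuman and start playback. Class and member names are assumed from this page.
#include "GameFramework/Actor.h"
// #include "VHLiveLinkAnimation.h"   // assumed header name of the plugin component

void StartFaceAnimation(AActor* MetaHuman)
{
    // The component that was added to the MetaHuman Blueprint in step 1.
    UVHLiveLinkAnimation* FaceAnim = MetaHuman->FindComponentByClass<UVHLiveLinkAnimation>();
    if (FaceAnim)
    {
        // Assumes the animation file was already set in the Details tab (step 3);
        // otherwise load one explicitly, e.g. FaceAnim->LoadAnimationFile(...).
        FaceAnim->Play();
    }
}
```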

:warning: If you want to use .csv or .txt files in packaged builds (e.g., for the CAVE), make sure to add the directories containing those files to `Additional Non-Asset Directories To Copy` in your project settings :warning:
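For reference, this packaging setting is stored in `Config/DefaultGame.ini`, with directories given relative to the Content folder. A sketch with a hypothetical `FaceAnimations` folder:

```ini
; Config/DefaultGame.ini ("FaceAnimations" is a hypothetical folder below Content/)
[/Script/UnrealEd.ProjectPackagingSettings]
+DirectoriesToAlwaysStageAsNonUFS=(Path="FaceAnimations")
```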
At this point, the basic setup is complete and you can customize the component to your needs. The next section describes the feature set in more detail.
# Feature Set
We implemented different components to animate the faces of virtual humans, all derived from `VHFaceAnimation` (e.g., `VHLiveLinkAnimation`, `VHOculusLipSync`, ...), which contain the following (see the usage sketch after this list):
* Flags (`bUseHeadRotation` and `bUseEyeRotations`) that specify whether only blendshapes and the jaw bone or also head and eye movements should be used (not provided in every component, e.g., not in OculusLipSync)
* A `Calibrate()` method to always use a specific frame of a specific animation as the neutral face, or a `bResetFaceCalibration` flag to use the frame at time `TimeForCalibration` of the current animation for calibration, i.e., it is subtracted from every frame. This is useful when the animation sequence starts with a neutral face that is not detected as entirely neutral due to differences between the tracked faces.
* A `LoadAnimationFile()` method that loads an animation with the given implementation and can also be used to preload animations, since they are cached and reused. Alternatively, you can specify an animation to use via `AnimationName`, which, if set, is loaded on `BeginPlay`.
* General `PlayFromTime(StartTime [in seconds])`, `Play()`, `Stop()` and `IsPlaying()` methods
* A `SaveAsAnimSequence(FString AnimName)` method which can be used to convert the animation data previously loaded with `LoadAnimationFile()` into an Unreal Animation Asset (.uasset) with the name `AnimName`
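As a rough illustration of how these pieces fit together, here is a hedged C++ sketch. All signatures and the `U`-prefixed class name are assumptions based on the descriptions above (the plugin headers are authoritative), and the file and asset names are made up.

```cpp
// Hedged sketch of the VHFaceAnimation API described above; signatures are assumed,
// not taken from the plugin headers. File and asset names are hypothetical.
// #include "VHFaceAnimation.h"   // assumed header name

void PreviewFaceAnimation(UVHFaceAnimation* FaceAnim)
{
    if (!FaceAnim)
    {
        return;
    }

    // Only drive blendshapes and the jaw bone, no head or eye movement.
    FaceAnim->bUseHeadRotation = false;
    FaceAnim->bUseEyeRotations = false;

    // Preload an animation; it is cached and reused on later loads.
    FaceAnim->LoadAnimationFile(TEXT("FaceAnimations/Greeting.csv"));

    // Use the frame at TimeForCalibration as the neutral face; it is subtracted from every frame.
    FaceAnim->bResetFaceCalibration = true;
    FaceAnim->TimeForCalibration = 0.0f;

    // Start playback two seconds into the animation ...
    FaceAnim->PlayFromTime(2.0f);

    // ... and optionally bake the loaded data into an Unreal Animation Asset.
    FaceAnim->SaveAsAnimSequence(TEXT("Greeting_Baked"));
}
```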
The Face Animation uses a pose asset that maps the different poses provided by, e.g., LiveLink to the blendshapes of the model. If you need to create a new pose asset, the names of the poses can be found in `PluginContent/Characters/FacePoseAssetsResources`. You need to create an animation with curve values for all needed blendshapes, where each frame of the animation is one pose (frame 0 being the first pose). :warning: Avoid bone tracks, and remove them from the animation if present. Then you can click `Create Asset -> Pose Asset` and copy-paste the pose names in there.
:speaking_head: A small explanation of how to generate TTS speech and lip sync for it can be found [here](TTS-lipsync).
# Oculus Lip Sync (VHOculusLipSync)
Use Oculus Lip Sync to animate the face based on an audio file only.
First, the audio file needs to be preprocessed, e.g., with the [OculusLipSyncWAVParser](https://devhub.vr.rwth-aachen.de/VR-Group/oculuslipsyncwavparser/-/tree/master). This generates a visemes.txt file, which is then used as the animation file.
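To keep the lips in sync with the sound, the visemes animation and the corresponding audio can be started together. A hedged sketch, assuming the component class is `UVHOculusLipSync` and that `LoadAnimationFile()` and `Play()` behave as described in the feature set above; the sound asset and the visemes path are hypothetical.

```cpp
// Hedged sketch: start the Oculus lip-sync animation together with its audio.
// UVHOculusLipSync, LoadAnimationFile() and Play() are assumed from this page;
// the visemes path and the USoundBase asset are hypothetical.
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"
// #include "VHOculusLipSync.h"   // assumed header name

void PlayLipSyncedSpeech(UVHOculusLipSync* LipSync, USoundBase* Speech)
{
    if (!LipSync || !Speech)
    {
        return;
    }

    // visemes.txt as generated from the .wav file by the OculusLipSyncWAVParser.
    LipSync->LoadAnimationFile(TEXT("FaceAnimations/Speech_visemes.txt"));

    // Start audio and face animation in the same frame so they stay aligned.
    UGameplayStatics::PlaySound2D(LipSync, Speech);
    LipSync->Play();
}
```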
For Character Creator 3 models, use the `PluginContent/Characters/Henry/OculusLipSyncToCC3` pose asset (which was created from the `OculusLipSyncToCC3Anim` animation sequence).
# Live Link Face, based on iOS ARKit (VHLiveLinkFaceAnimation)
Track the face using LiveLinkFace (for more information see https://docs.unrealengine.com/en-US/AnimatingObjects/SkeletalMeshAnimation/FacialRecordingiPhone/index.html). However, we do not connect live to the iPhone but just use the saved .csv file.
Copy that .csv file somewhere into the Content folder and load it.
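If the file lives under Content, the absolute path can be built with `FPaths` before loading it. A small hedged sketch; the subfolder and file name are made up, and the `U`-prefixed class name and `LoadAnimationFile()` signature are assumed from this page.

```cpp
// Hedged sketch: build the path to a LiveLinkFace .csv stored under Content and play it.
// The subfolder and file name are hypothetical; LoadAnimationFile()'s signature is assumed.
#include "Misc/Paths.h"
// #include "VHLiveLinkFaceAnimation.h"   // assumed header name

void LoadLiveLinkCsv(UVHLiveLinkFaceAnimation* FaceAnim)
{
    if (!FaceAnim)
    {
        return;
    }

    // e.g. <Project>/Content/FaceAnimations/MySlate_1_iPhone.csv
    const FString CsvPath =
        FPaths::Combine(FPaths::ProjectContentDir(), TEXT("FaceAnimations/MySlate_1_iPhone.csv"));

    FaceAnim->LoadAnimationFile(CsvPath);
    FaceAnim->Play();
}
```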
For Character Creator 3 models, use the `PluginContent/Characters/Henry/LiveLinkFaceToCC3` pose asset (which was created from the `LiveLinkFaceToCC3Anim` animation sequence).
# Open Face (VHOpenFaceAnimation)
It is also possible to track a face in a recorded video using OpenFace directly.
For that:
1) Get OpenFace https://github.com/TadasBaltrusaitis/OpenFace/wiki/Windows-Installation
Then parse a video using e.g. OpenFaceOffline.exe (for now only RecordAUs and Re...)
Copy this .csv file over and load it.
:warning: As of now, OpenFace does not track, among others, AU 22 (lip funneler) and AU 18 (lip pucker). An issue about that was filed here: https://github.com/TadasBaltrusaitis/OpenFace/issues/936. Apparently it is not that easy to add, therefore we recommend using LiveLinkFace for now.
:warning: Also, as of writing this, this component does not use a pose asset yet; this should eventually be implemented.