We implemented different components to animate the face of virtual humans, all de…
* A ``LoadAnimationFile()`` method that loads an animation with the given implementation. It can also be used to preload animations, since animations are cached and reused. Alternatively, you can specify an animation with ``AnimationName``, which, if set, is loaded on ``BeginPlay``.
* Provides the general ``PlayFromTime(StartTime [in seconds])``, ``Play()``, ``Stop()``, and ``IsPlaying()`` methods (see the usage sketch below).
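As a usage illustration, here is a minimal sketch of this API in Unreal C++. The component class name ``UVHLiveLinkFaceAnimation``, the header path, and the exact parameter types are assumptions for illustration and may differ from the actual plugin:

```cpp
#include "GameFramework/Actor.h"
#include "VHLiveLinkFaceAnimation.h" // hypothetical include path

void PlayFaceAnimation(AActor* VirtualHuman)
{
    // Look up the face animation component on the virtual human actor.
    UVHLiveLinkFaceAnimation* FaceAnim =
        VirtualHuman->FindComponentByClass<UVHLiveLinkFaceAnimation>();
    if (!FaceAnim)
    {
        return;
    }

    // Preload the animation; it is cached and reused on later loads.
    FaceAnim->LoadAnimationFile(TEXT("Greeting.csv"));

    // Start playback 1.5 seconds into the animation (Play() starts at 0).
    FaceAnim->PlayFromTime(1.5f);

    // Later, e.g., when the animation should be interrupted:
    if (FaceAnim->IsPlaying())
    {
        FaceAnim->Stop();
    }
}
```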
The Face Animation uses a pose asset that maps the different poses provided by, e.g., LiveLink to the blendshapes of the model (:warning: except for VHOpenFaceAnimation, which still uses the hard-coded FACS definitions from VHFacialExpressions; these should be moved to a pose asset as well! :warning:). If you need to create a new pose asset, the names of the poses can be found in ``PluginContent/Characters/FacePoseAssetsResources``. Create an animation with curve values for all needed blendshapes, where each frame of the animation is one pose (frame 0 being the first pose). :warning: Avoid bone tracks, and remove them from the animation if present. Then click ``Create Asset -> Pose Asset`` and copy-paste the pose names in there.
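When authoring the curve animation, it helps to double-check the exact blendshape (morph target) names on the target mesh. The following is a minimal sketch, assuming the UE5-style ``GetMorphTargets()`` accessor; ``LogBlendshapeNames`` is a hypothetical helper, not part of the plugin:

```cpp
#include "Engine/SkeletalMesh.h"
#include "Animation/MorphTarget.h"

// Hypothetical helper: logs all morph target (blendshape) names of a mesh,
// so they can be matched against the pose names used by the pose asset.
void LogBlendshapeNames(const USkeletalMesh* Mesh)
{
    for (const UMorphTarget* Morph : Mesh->GetMorphTargets())
    {
        UE_LOG(LogTemp, Log, TEXT("Blendshape: %s"), *Morph->GetName());
    }
}
```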
A **demo map** can be found at ``Maps/FaceAnimation`` in [Unreal Character Test](https://devhub.vr.rwth-aachen.de/VR-Group/unreal-character-test). It implements key events for: 1 (VHLiveLinkFaceAnimation), 2 (VHOpenFaceAnimation), 3 (VHOculusLipSync), and 4 (saving the LiveLinkFace animation into an Unreal Animation Sequence Asset). See the ``FaceAnimation/FaceAnimationInput`` blueprint for details.
...
Copy this CSV file over and load it.
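As an illustration, loading the copied file could look like the following sketch. It assumes ``VHOpenFaceAnimation`` exposes the same ``LoadAnimationFile()``/``Play()`` API described above; the class name spelling and the file name are placeholders:

```cpp
#include "GameFramework/Actor.h"
#include "VHOpenFaceAnimation.h" // hypothetical include path

void LoadOpenFaceRecording(AActor* VirtualHuman)
{
    UVHOpenFaceAnimation* FaceAnim =
        VirtualHuman->FindComponentByClass<UVHOpenFaceAnimation>();
    if (FaceAnim)
    {
        // Placeholder file name for the copied OpenFace CSV recording.
        FaceAnim->LoadAnimationFile(TEXT("OpenFaceRecording.csv"));
        FaceAnim->Play();
    }
}
```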
:warning: As of now, OpenFace does not track, among others, AU 22 (lip funneler) and AU 18 (lip pucker).
We filed an issue about this here: https://github.com/TadasBaltrusaitis/OpenFace/issues/936
Apparently this is not easy to fix, so we recommend using LiveLinkFace for now.
:warning: Also, as of writing this, the OpenFace implementation does not use a pose asset yet; this should eventually be implemented.