We implemented different components to animate the face of virtual humans, all derived from ``VHFaceAnimation``, which provides:
* A method ``SaveAsAnimSequence(FString AnimName)`` that converts the animation data given as .csv, .txt, etc. into an Unreal Animation Asset (.uasset)
* Flags (``bUseHeadRotation`` and ``bUseEyeRotations``) that specify whether only blendshapes and the jaw bone, or also head and eye movements, should be used (not available in every component, e.g., not in VHOculusLipSync)
* A ``bResetFaceCalibration`` flag that specifies whether the frame at time ``TimeForCalibration`` will be used for calibration, i.e., subtracted from every frame. This can be used when the animation sequence starts with a neutral face that is not detected as entirely neutral due to differences in the tracked faces.
* General ``PlayFromTime(StartTime [in seconds])``, ``Play()``, ``Stop()`` and ``IsPlaying()`` methods (see the usage sketch below)
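A minimal sketch of how these calls might be driven from C++. The ``UVH`` class prefix, the header path, public member access and the ``VirtualHuman`` actor pointer are assumptions, not guaranteed plugin API; only the method and flag names above are taken from the description:

```cpp
#include "GameFramework/Actor.h"
#include "VHFaceAnimation.h"   // assumed header name

// Sketch: start face playback on a spawned virtual human actor whose
// animation data has already been loaded.
void StartFacePlayback(AActor* VirtualHuman)
{
    UVHFaceAnimation* FaceAnim = VirtualHuman->FindComponentByClass<UVHFaceAnimation>();
    if (!FaceAnim || FaceAnim->IsPlaying())
    {
        return;
    }

    // Subtract the frame at TimeForCalibration from every frame, so a
    // not-quite-neutral starting pose is treated as the neutral face.
    FaceAnim->bResetFaceCalibration = true;

    // Start playback from the beginning ...
    FaceAnim->Play();

    // ... or from a specific point in time (in seconds):
    // FaceAnim->PlayFromTime(1.5f);

    // Optionally bake the loaded data into an Unreal Animation Asset (.uasset);
    // the asset name here is just an example:
    // FaceAnim->SaveAsAnimSequence(TEXT("MyFaceAnim"));
}
```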
# Oculus Lip Sync (VHOculusLipSync) #
Use Oculus Lip Sync to animate the face based on an audio file only.
First, the audio file needs to be preprocessed, e.g., with the [OculusLipSyncWAVParser](https://devhub.vr.rwth-aachen.de/VR-Group/oculuslipsyncwavparser/). This generates a .txt file which can be copied into the project and loaded with ``ReadInVisemesFromTxt()``. Loading also reads in a mapping of the Oculus visemes to the Character Creator 3 blendshapes, which can be specified via ``MorphTargetMappingFile = "LipSync/CC3OculusLipSyncMapping.csv"`` and is shipped with the plugin (so normally leave that as is). If you want to change the mapping or use another character setup, take a look at ``ULipSyncDataHandlingHelper`` to see how to create one.
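A hedged sketch of that call order in C++; ``ReadInVisemesFromTxt()`` and ``MorphTargetMappingFile`` come from the description above, while the ``UVH`` class prefix, header path, member visibility and how the .txt file path is configured are assumptions:

```cpp
#include "GameFramework/Actor.h"
#include "VHOculusLipSync.h"   // assumed header name

// Sketch: load preprocessed Oculus visemes and play them back.
// Which .txt file is read is assumed to be configured on the component
// (e.g., in the editor) before this is called.
void PlayLipSync(AActor* VirtualHuman)
{
    UVHOculusLipSync* LipSync = VirtualHuman->FindComponentByClass<UVHOculusLipSync>();
    if (!LipSync)
    {
        return;
    }

    // Normally keep the CC3 mapping that ships with the plugin:
    // LipSync->MorphTargetMappingFile = "LipSync/CC3OculusLipSyncMapping.csv";

    // Load the visemes generated by OculusLipSyncWAVParser ...
    LipSync->ReadInVisemesFromTxt();

    // ... and play them, typically together with the matching audio file.
    LipSync->Play();
}
```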
# Live Link Face, based on iOS ARKit (VHLiveLinkFaceAnimation) #
Track the face using LiveLinkFace (for more information see https://docs.unrealengine.com/en-US/AnimatingObjects/SkeletalMeshAnimation/FacialRecordingiPhone/index.html). However, we do not connect live to the iPhone but just use the saved .csv file.
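Since the LiveLinkFace recording can also contain head and eye rotations, the ``bUseHeadRotation``/``bUseEyeRotations`` flags from above are relevant here. A minimal sketch, assuming a ``UVHLiveLinkFaceAnimation`` component whose .csv data has already been loaded (class prefix, header path and member visibility are assumptions):

```cpp
#include "GameFramework/Actor.h"
#include "VHLiveLinkFaceAnimation.h"   // assumed header name

// Sketch: replay a LiveLinkFace recording including head and eye movement.
void PlayLiveLinkFaceRecording(AActor* VirtualHuman)
{
    UVHLiveLinkFaceAnimation* FaceAnim =
        VirtualHuman->FindComponentByClass<UVHLiveLinkFaceAnimation>();
    if (!FaceAnim)
    {
        return;
    }

    FaceAnim->bUseHeadRotation = true;   // drive the head bone from the recording
    FaceAnim->bUseEyeRotations = true;   // drive the eye bones from the recording
    FaceAnim->Play();
}
```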
# OpenFace (VHOpenFaceAnimation) #
Copy this csv file over and load it with ``VHOpenFaceAnimation::LoadOpenFaceCSVF...``
:warning: As of now, OpenFace does not track, among others, AU 22 (lip funneler) and AU 18 (lip pucker).
We filed an issue about this here: https://github.com/TadasBaltrusaitis/OpenFace/issues/936
Apparently it is not that easy to add, so we recommend using LiveLinkFace for now.
|:arrow_left: [Go back to main page](home)|
|--------------:|