We implemented different components to animate the faces of virtual humans.
# Oculus Lip Sync (VHOculusLipSync) #
# Live Link Face, based on iOS ARKit (VHLiveLinkFaceAnimation) #
Track the face using LiveLinkFace (for more information see https://docs.unrealengine.com/en-US/AnimatingObjects/SkeletalMeshAnimation/FacialRecordingiPhone/index.html). However, we do not connect live to the iPhone but instead use the saved .csv file.
Copy that .csv file somewhere into the Content folder and load it using ``UVHLiveLinkAnimation::LoadLiveLinkCSVFile()``.
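A minimal sketch of how this call could look, e.g. in ``BeginPlay()``. The exact signature of ``LoadLiveLinkCSVFile()`` is an assumption here (we assume it takes a Content-relative path); ``AMyVirtualHuman``, the header name, and the file path are placeholders:

```cpp
// Hypothetical actor class standing in for your virtual-human actor;
// the point here is only the loading call.
#include "GameFramework/Actor.h"
#include "VHLiveLinkFaceAnimation.h"

void AMyVirtualHuman::BeginPlay()
{
    Super::BeginPlay();

    // Grab the LiveLinkFace animation component attached to this actor.
    if (UVHLiveLinkAnimation* FaceAnim = FindComponentByClass<UVHLiveLinkAnimation>())
    {
        // Assumed parameter: path to the .csv file copied into Content/.
        FaceAnim->LoadLiveLinkCSVFile(TEXT("FaceRecordings/MyTake.csv"));
    }
}
```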
# Open Face (VHOpenFaceAnimation) #
For now, it is easiest to track a face in a recorded video using OpenFace directly. To do so:
1) Get OpenFace https://github.com/TadasBaltrusaitis/OpenFace/wiki/Windows-Installation
2) Download Models: https://github.com/TadasBaltrusaitis/OpenFace/wiki/Model-download
3) Parse the video using e.g. OpenFaceOffline.exe (for now, only RecordAUs and RecordPose are supported)
4) Copy the resulting .csv file over and load it with ``VHOpenFaceAnimation::LoadOpenFaceCSVFile()`` (see the sketch below)
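Loading works analogously to the LiveLinkFace case above. Again, the parameter list of ``LoadOpenFaceCSVFile()`` is an assumption, as are the ``U``-prefixed component class name, ``AMyVirtualHuman``, and the path:

```cpp
#include "GameFramework/Actor.h"
#include "VHOpenFaceAnimation.h"

void AMyVirtualHuman::SetupOpenFaceAnimation()
{
    // Grab the OpenFace animation component attached to this actor.
    if (UVHOpenFaceAnimation* FaceAnim = FindComponentByClass<UVHOpenFaceAnimation>())
    {
        // Assumed parameter: path to the .csv file produced by OpenFace
        // and copied into the Content folder.
        FaceAnim->LoadOpenFaceCSVFile(TEXT("FaceRecordings/MyOpenFaceTrack.csv"));
    }
}
```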
:warning: As of now, OpenFace does not track, among others, AU 22 (lip funneler) and AU 18 (lip pucker).
We filed an issue about this here: https://github.com/TadasBaltrusaitis/OpenFace/issues/936
Apparently fixing this is not that easy, so we recommend using LiveLinkFace for now.