We implemented several components to animate the faces of virtual humans:
# Oculus Lip Sync (VHOculusLipSync) #
# Live Link Face, based on iOS ARKit (VHLiveLinkFaceAnimation) #
Track the face using the Live Link Face iOS app (for more information see https://docs.unrealengine.com/en-US/AnimatingObjects/SkeletalMeshAnimation/FacialRecordingiPhone/index.html). However, we do not connect to the iPhone live; instead, we use the saved .csv file.
Copy that .csv file somewhere into the Content folder and load it using ``UVHLiveLinkAnimation::LoadLiveLinkCSVFile()``.
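The exact signature of ``LoadLiveLinkCSVFile()`` is not documented here, so the following is only a minimal sketch: it assumes the function takes the file path as an ``FString`` and that the component is attached to the virtual human actor. The actor class ``AMyVirtualHuman``, the header name, and the file path are placeholders.

```cpp
#include "VHLiveLinkAnimation.h" // assumed header for UVHLiveLinkAnimation
#include "Misc/Paths.h"

void AMyVirtualHuman::BeginPlay()
{
    Super::BeginPlay();

    // Assumed: the component is attached to this actor.
    if (UVHLiveLinkAnimation* LiveLinkAnim = FindComponentByClass<UVHLiveLinkAnimation>())
    {
        // Assumed: the loader takes a path, here built relative to the Content folder.
        const FString CSVPath = FPaths::ProjectContentDir() / TEXT("FaceTakes/MyTake.csv");
        LiveLinkAnim->LoadLiveLinkCSVFile(CSVPath);
    }
}
```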
# OpenFace (VHOpenFaceAnimation) #
For now it is easiest to track the face in a recorded video using OpenFace directly (a sketch for reading the resulting .csv is given after the steps below). To do so:
1) Get OpenFace: https://github.com/TadasBaltrusaitis/OpenFace/wiki/Windows-Installation
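Once OpenFace has processed a video, it writes a .csv file whose header row names each column (Action Unit intensities appear as ``AU01_r`` through ``AU45_r``). As a hedged sketch of how such a file could be read on the Unreal side, the helper below parses the AU intensity columns per frame; the function name and its placement are illustrative assumptions, not part of VHOpenFaceAnimation's actual API.

```cpp
#include "Misc/FileHelper.h"
#include "Containers/UnrealString.h"

// Hypothetical helper: returns one map per video frame,
// mapping an AU column name (e.g. "AU01_r") to its intensity.
TArray<TMap<FString, float>> ParseOpenFaceAUs(const FString& CSVPath)
{
    TArray<TMap<FString, float>> Frames;

    TArray<FString> Lines;
    if (!FFileHelper::LoadFileToStringArray(Lines, *CSVPath) || Lines.Num() < 2)
    {
        return Frames; // file missing or contains no data rows
    }

    // The first row names every column; OpenFace pads entries with spaces.
    TArray<FString> Header;
    Lines[0].ParseIntoArray(Header, TEXT(","));
    for (FString& Name : Header) { Name.TrimStartAndEndInline(); }

    for (int32 Row = 1; Row < Lines.Num(); ++Row)
    {
        TArray<FString> Values;
        Lines[Row].ParseIntoArray(Values, TEXT(","));

        TMap<FString, float> Frame;
        for (int32 Col = 0; Col < Header.Num() && Col < Values.Num(); ++Col)
        {
            // Keep only the AU intensity columns ("AUxx_r"); the "_c"
            // columns hold presence flags instead and are skipped here.
            if (Header[Col].StartsWith(TEXT("AU")) && Header[Col].EndsWith(TEXT("_r")))
            {
                Frame.Add(Header[Col], FCString::Atof(*Values[Col]));
            }
        }
        Frames.Add(MoveTemp(Frame));
    }
    return Frames;
}
```

Each returned per-frame map could then be mapped onto the face rig, e.g. translated into morph target weights.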