We implemented different components to animate the face of virtual humans, all de…
* A ``bResetFaceCalibration`` flag to specify whether the frame at time ``TimeForCalibration`` is used for calibration, i.e., subtracted from every frame. This is useful when the animation sequence starts with a neutral face that is not detected as entirely neutral due to differences between the tracked faces.
* Provides the general ``PlayFromTime(StartTime [in seconds])``, ``Play()``, ``Stop()`` and ``IsPlaying()`` functions
A **demo map** can be found in [Unreal Character Test](https://devhub.vr.rwth-aachen.de/VR-Group/unreal-character-test) at Maps/FaceAnimation. It implements key events for: 1 (VHLiveLinkFaceAnimation), 2 (VHOpenFaceAnimation), 3 (VHOculusLipSync) and 4 (saving the LiveLinkFace animation into an Unreal Animation Sequence Asset). See FaceAnimation/FaceAnimationInput for details.