With the MoCap plugin, you can capture human body and, optionally, finger movements and translate them into animation sequences. The plugin is designed for Unreal Engine 4.27. Besides an HTC Vive and two Index or Vive controllers, the intended setup requires six Vive Trackers that can be attached with straps. For the plugin to work, you also need to install the Character Plugin and all its dependencies into the project.
If you are looking for a plugin with built-in avatar support and calibration functionality that needs fewer input sensors, have a look at the Avatar Plugin. With a little work, however, the MoCap plugin can also support avatar functionality (for an example, see the ASPawn (CalActor for automatic height calibration) in Audictive Study).
Two projects have made use of this plugin:
- MoCap for Audictive Study, which uses the MoCap plugin to record multiple movements for the HTR texts (used in the audictive project) and fine-tune them.
- Gesture Manipulation for Audictive, developed in a master's thesis by Jonas Schüppen to improve or deliberately degrade recorded gestures.
Below you will find a detailed step-by-step guide to using the Motion Capture Plugin.
Measuring Yourself
For greater accuracy, you have the option to measure your body. The measurements are taken into account during recording and when translating the recorded data into an Animation Sequence. Copy the file "Measurements.txt" located in the plugin content folder and enter your own measurements. The picture below shows the exact measurement locations; the letters match those in the table and in "Measurements.txt".
Letter | Measurement |
---|---|
A | Shoulder to shoulder |
B | Shoulder to elbow |
C | Elbow to wrist |
D | Floor to hip |
E | Hip to top of shoulder |
F | Top of shoulder to slightly under mouth |
G | Hipbone to hipbone |
H | Hipbone to knee |
I | Knee to middle of sole and ankle |
Setting up the Equipment
If you have already paired your Vive Trackers with SteamVR and set the tracker roles, but do not know which tracker belongs to which body part, you can turn them on one by one and observe their statuses under SteamVR controller settings -> Manage Vive Tracker. Otherwise, pair your trackers one by one and assign the following roles to them under "Manage Vive Tracker" (note which tracker belongs to which role, or strap them to your body immediately):
SteamVR Role | Where to attach |
---|---|
Right/Left Foot | lower leg, slightly above ankle |
Right/Left Elbow | lower arm, as close as possible to the elbow |
Chest | center of chest (anywhere on the rib cage should be okay) |
Hip | center of hip, strap around the hip bones |
Setting up Unreal
If you get loading errors regarding Full Body IK at project start, navigate to the UE engine installation folder, go to "Engine/Plugins/Experimental/FullBodyIK", open "FullBodyIK.uplugin" and change "LoadingPhase" to "PreDefault". Restart Unreal Engine and check the SaveSequenceRig asset in the plugin content to see whether the FullBodyIK node is present. If this does not fix the errors, modify the "Plugins" section of your .uproject file so that it contains the following two entries:

```json
"Plugins": [
    { "Name": "ControlRig", "Enabled": true },
    { "Name": "FullBodyIK", "Enabled": true }
]
```
If you want the mirror in the plugin map to work, you have to enable the project setting "Support global clip plane for Planar Reflections" and restart the engine. Unfortunately, this requires the shaders to recompile.
Assign the following input mappings to buttons of your choice in the input action settings of your project: "ToggleRecording", "SetMarker", "SaveAnimation".
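For instance, the mappings could be added directly to Config/DefaultInput.ini; the keys below are only placeholders, so pick whatever buttons suit your controllers:

```ini
[/Script/Engine.InputSettings]
; Example action mappings (placeholder keys, choose your own)
+ActionMappings=(ActionName="ToggleRecording",Key=MotionController_Left_Trigger)
+ActionMappings=(ActionName="SetMarker",Key=MotionController_Right_Trigger)
+ActionMappings=(ActionName="SaveAnimation",Key=SpaceBar)
```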
Go to the Details Panel of the MCController and adjust the properties under "MotionCapture ...":
Motion Capture
- Scale: The scale applied to the animation mesh before the recorded data is applied. Important for recording and animation translation. During translation, the mesh is scaled back to its original scale at the end.
- Path to Measurement File: The path to the file containing your measurements, relative to the project folder.
- Name of Recording: The folder name to which the recording data is saved after recording and from which it is loaded in the "Save Animation" stage.
- Enable Finger Reproduction: Whether the finger data from the Index controllers should be recorded/saved.
- Finger Tracking Method: "From Animation" smooths the finger data and introduces latency, while "From Device" captures the raw data with possibly more jitter.
Motion Capture Map
- Pawn: This should always point to the MCPawn used in the map.
- Spectator Cam: If this is not null, the computer screen shows the output of the given render target instead of the HMD view. When setting up a spectator cam, you can set its target texture to the plugin's SpectatorTexture or to your own texture.
- Left Foot Plane: The plane indicating where to put the left foot.
- Right Foot Plane: The plane indicating where to put the right foot.
Motion Capture Anim
- Frames Per Second: The framerate of the final Animation Sequence. Not relevant during recording, only during the "Save Animation" stage. This is not necessarily the exact framerate; for example, if the capture data is not dense enough over time, the framerate decreases. The time length of the capture is always prioritized over the framerate.
- Gesture Do Holding: Controls whether the hands should be kept at the same position, indicated by a time point. Currently, the "Gesture" options do not work without custom programming.
- Gesture Hold Scale Excess Start/End Time: Stretches the animation at a certain point during translation, so that the given number of seconds is added to the animation length. Currently, the "Gesture" options do not work without custom programming.
- Gesture Hold Start/End Excess Easing Exponent: Controls the time transformation of the animation stretch (if the excess time is greater than 0). A value of 1 equals no easing; otherwise it behaves like a smooth-stop function. Currently, the "Gesture" options do not work without custom programming, which has already been done in the MoCap for Audictive Study project.
- Should Return To Default Pose At Start And End: Ideal for looping/chaining animations smoothly. Returns to a default pose, selectable with the properties below, at the start and end of an animation.
- Default Pose File Relative From Content: The last default pose is saved in Content/LastDefaultPose.txt. If you want to use another file, set it here and set the two properties below to negative values.
- Default Pose Reference Anim Time: The time (in seconds) within the animation that you want to take as the default pose.
- Default Pose Reference Anim Part: The part of the recording (each generated animation file is one part) that you want to take as the default pose. Begins at 1.
- Default Pose Time To Interpolate To It: The time at which the interpolation to the default pose begins/ends, relative to the animation's start/end.
- Chest Control Center: Relative to the average hip position after hip reduction is applied. Moves the chest so that the average chest position is located at this center (currently, only the Y value is applied).
- Chest Control Factor: Controls how much influence the chest control center has over the chest position. 1 means it is shifted fully, 0 not at all. Other values represent overshooting.
- Set Chest Transform After Arm IK: By default, after the spine and arm IK are done, the chest transform is set back to the recorded chest transform to ensure that they match. This, however, means that the arms cannot reach their goal positions in many configurations. If that is desired (e.g., for avatars), this property can be turned off.
- Hip Shift To Reducing Center: Controls whether the average hip position should be shifted to the hip reducing center before hip reduction is applied. The Hip Reducing Factor has no influence on the shifting.
- Hip Reducing Center: The position around which hip movement is reduced and, if the option above is selected, towards which it is shifted. Relative to the left and right foot planes.
- Hip Reducing Factor: Determines how strongly the hip movement is reduced. 1 makes the hip static, 0 applies no reduction at all (see the sketch after this list).
- Nr of Leg Smoothing Iterations: How many times smoothing is performed on the hip and feet input data while translating the animation (currently only on the hip).
- Leg Smoothing Neighbor Window: A greater value increases the amount of smoothing per iteration.
- Lock Feet To Green Foot Indicators: Whether the feet should be positioned and rotated to the foot plane transforms.
- Foot Height Target: The height target of the feet. Useful for translating animations to different meshes.
- Max Leg Length: To reduce foot wiggling when the feet are locked to the foot indicators, the body is pulled down so that the legs can cover the distance between feet and hip. This sets the maximum allowed recorded foot-to-hip distance before the body is pulled down.
- Use Captured Hand Position: Select whether the hand positions should come from the hand controllers or be derived from the elbow rotations.
- Limit the hand rotation (can cause rotational jumping): Limits the rotation of the hands along their forward axis.
- Finger Angle Scale: Controls how strongly the hands and fingers are closed. 1 would be a fist, 0 not closed at all. 0.25 seems to be a good subtle closing.
- Additional Post Processing Offsets: If a rotation or position looks consistently unnatural throughout an animation, you can offset it and translate the animation again. Also useful for translating a batch of animations with the same offset.
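The hip reduction properties combine roughly as in the following sketch. This is only an illustration of the behavior described above, assuming a simple linear interpolation; the names and the exact math are assumptions, not the plugin's actual code:

```cpp
struct FVec { float X, Y, Z; };

// Hypothetical illustration of hip reduction: movement is scaled around a
// base point. Factor 1 makes the hip static, factor 0 leaves it unchanged.
FVec ReduceHip(FVec HipPos, FVec HipAverage, FVec ReducingCenter,
               float ReducingFactor, bool bShiftToCenter)
{
    // If shifting is enabled, the motion is recentered on the reducing
    // center regardless of the factor; otherwise it stays around its average.
    const FVec Base = bShiftToCenter ? ReducingCenter : HipAverage;
    const float Keep = 1.0f - ReducingFactor; // fraction of movement kept
    return { Base.X + (HipPos.X - HipAverage.X) * Keep,
             Base.Y + (HipPos.Y - HipAverage.Y) * Keep,
             Base.Z + (HipPos.Z - HipAverage.Z) * Keep };
}
```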
Recording
Start playing in VR in the plugin map. Align your body with the Henry mesh and try to match the rotations and positions of your hands, feet, and head as well. While holding this pose, press "ToggleRecording". This captures the offsets from the input devices to the joints of the mesh; afterwards the mesh disappears. To start recording, press "SetMarker". Press "SetMarker" again at any time to start a new Animation Sequence. Finally, press "ToggleRecording" to end the recording. It is automatically saved under "Saved/Recordings/[NameOfRecording]" in the form of a data log. If a recording with this name already exists, a unique number is appended to the name.
Saving a Recording as an Animation Sequence
Go to the details panel of the MCController actor and update the "Name of Recording" property to match the recording you want to translate into an Animation Sequence (if you are still in the same play session as the recording, this is not necessary). If the name was changed during recording because a unique number was appended, you have to update the setting accordingly. Now you can play (not necessarily in VR) and press "SaveAnimation". Wait for the on-screen log message confirming that your animation was created, or wait until a new folder appears in the content folder of the project.
If the Animation Sequence is faulty in that it has no end point or looks corrupted, try restarting UE4 and translating the animation again. Sometimes, when a play session is forcefully stopped, the animation stop timestamp is not written to the correct log. After a restart, the MCController tries to recover it when translating an animation.
If an Animation Sequence seems to be frozen, try setting and unsetting the checkbox "Allow Frame Stripping" in the Asset Details panel of the Animation Sequence.
If an Animation Sequence seems to have a fixed hip position while some parts (e.g., the feet) slide around, go to the Skeleton Tree tab, select Options -> Show Retargeting Options, and change the Translation Retargeting for the pelvis to Animation or Animation Scaled. This is done automatically when saving the animation, but since Translation Retargeting is a property of the skeleton, not of the Animation Sequence, it may have been changed after the animation was saved.
Tips
Try to avoid very reactive, fast movements. Especially when taking steps, the trackers can jiggle and cause strange results. There are countermeasures implemented for the hip, but it is better to avoid this.
Generally, the more separate recording sessions you do, the more tweaking of the translation settings may be necessary. To reduce this, you can record multiple animations in one recording session and write down the settings you used for it, so you do not have to find them again from scratch. The downside is that the translation itself takes longer, since you have to translate multiple animations.
For tweaking the settings, here are a few tips:
Problem in animation | What to do |
---|---|
Hip does not move, legs slide | In Animation Asset: Skeleton Tree -> Options -> Show Retargeting Options -> set Translation Retargeting for pelvis to Animation or Animation Scaled |
Frozen | Set and unset the checkbox "Allow Frame Stripping" in the Asset Details panel of the animation |
Legs or Hip jiggles / too smooth | Play with Leg Smoothing settings |
Hand behaves strangely in some motions | Turn off Captured Hand Position option |
Hand in strange position offset | Turn the Captured Hand Position option on or off or play with Additional Post Processing Offsets |
Hand rotates too far but the standard rotation is good | Turn on Limit Hand Rotation |
Hand rotation jumps | Turn off Limit Hand Rotation |
Body parts seem offset (pos or rot) | Additional Post Processing Offsets |
Shoulders are too far off | There is an experimental feature in the SaveSequenceRig: go there and disable the variable "NoShoulderLimit"; you can adjust "MaxShoulderOffDist" to set the positional limit of the shoulder relative to its initial position |
Modificators
In his master's thesis, Jonas introduced some modificators that can be used to improve gestures during the pipeline. These were incorporated into the MoCap plugin. To use them, add the desired modificators to the scene and then add them to the modificators list (they are executed from top to bottom, one after the other), as sketched below. For more information on good choices, and to experiment with the modificators on a single file, refer to his thesis project.
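Conceptually, applying the list could look like the following sketch; the type and method names here are placeholders for illustration, not the plugin's actual API:

```cpp
#include <vector>

// Hypothetical modificator interface, assumed for illustration only.
struct FRecordingData { /* captured tracker samples, etc. */ };

struct IModificator
{
    virtual void Apply(FRecordingData& Data) = 0;
    virtual ~IModificator() = default;
};

// The list is executed from top to bottom, one after the other,
// so the order of the entries matters.
void ApplyModificators(const std::vector<IModificator*>& Modificators,
                       FRecordingData& Data)
{
    for (IModificator* Mod : Modificators)
    {
        Mod->Apply(Data); // each modificator transforms the data in place
    }
}
```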
Advanced Notes
- The transform of the MCPawn matters. Currently, it has to be located at (0, 0, z) for the offset capture to work. This is relevant if you want to change the recording map or record in your own map.
- If you want the gesture hold options to work, look at the method "SaveAnimation" in MCController.cpp. Currently, no parameters are passed. Call this method with a time point in seconds (see the sketch below). The first point in the array is taken as the holding point and the start of the scaling.
- In a shipping build, recordings are saved to AppData/Local/ProjectName.
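For the gesture hold note above, a call might look like the following; the exact signature of "SaveAnimation" is an assumption here, so verify it in MCController.cpp before relying on it:

```cpp
// Hypothetical call into MCController's SaveAnimation; check the real
// signature in MCController.cpp, since no parameters are passed by default.
TArray<float> HoldTimePoints;
HoldTimePoints.Add(1.5f); // example: hold the hands at t = 1.5 s
// Per the note above, the first entry is taken as the holding point and
// the start of the excess-time scaling.
SaveAnimation(HoldTimePoints);
```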