Eye tracking will obviously only work with a headset that supports eye tracking (such as the Vive Pro Eye or HP Reverb G2 Omnicept). Head orientation tracking alone will always work!
There is a GazeTest map in the Study Framework Demo project.
Usage
- In the `StudySetup` actor, set `UseGazeTracker` to `EyeTracking` (if you want to use the actual eye tracker) or `HeadRotationOnly` (which just approximates the gaze direction by the head's forward direction; it is also the fallback if `EyeTracking` is chosen but no eye tracker is found).
- Choose which `GazeTrackingBackend` to use (for specifics, look at the respective sub-pages).
- Objects that should be "gazeable" have to have one of the two following components (see the first sketch after this list):
  - `SFGazeTarget`: This has its own sphere collision which is used for the line trace checks (so use this if only part of an actor should be gazeable, or even a region greater than the actor itself).
  - `SFGazeTargetActor`: The whole actor is used for the line trace checks.
  - Both either use the name specified as `TargetName` or, if `UseActorName` is activated, the name of the actor they are attached to.
- The GazeTracker (e.g. `USFGameInstance::Get()->GetGazeTracker()`) provides (see the second sketch after this list):
  - `GetCurrentGazeTarget()`: returns the name (`TargetName` or actor name) of the currently gazed-at target, or an empty string if nothing "gazeable" is looked at.
  - `LaunchCalibration()`: launches a calibration from code.
  - Per-frame logging of the currently gazed-at target name and the gaze direction into a separate file per participant. The participant's gaze tracking logs are located in the StudyLogs/GazeTrackingLogs folder. For head rotation logs, please refer to the Logging the Position of the Player section of the wiki.
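And a hedged sketch of using the runtime API from the last bullet, e.g. launching a calibration at `BeginPlay` and polling the gazed-at target each frame. It assumes `GetCurrentGazeTarget()` returns an `FString` and that `GetGazeTracker()` can return `nullptr`; the actor `AGazeReactor` is purely an example:

```cpp
// GazeReactor.h -- hypothetical example actor, not part of the framework.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "SFGameInstance.h" // assumed header providing USFGameInstance
#include "GazeReactor.generated.h"

UCLASS()
class AGazeReactor : public AActor
{
    GENERATED_BODY()

public:
    AGazeReactor() { PrimaryActorTick.bCanEverTick = true; }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Optionally trigger an eye tracker calibration from code.
        if (auto* Tracker = USFGameInstance::Get()->GetGazeTracker())
        {
            Tracker->LaunchCalibration();
        }
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        auto* Tracker = USFGameInstance::Get()->GetGazeTracker();
        if (!Tracker)
        {
            return;
        }

        // An empty string means nothing "gazeable" is currently looked at.
        const FString Target = Tracker->GetCurrentGazeTarget();
        if (Target == TEXT("GazeableButton"))
        {
            // Participant is looking at the target named above, e.g. highlight it.
        }
    }
};
```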
Prerequisite
If you just want to use `HeadRotationOnly` as the tracking mode, the following is not required.
Otherwise, look at the OpenXR Eye Tracking prerequisites or the SRanipal prerequisites.