Getting Facial Tracking Data
⚠ NOTICE: This legacy plugin is no longer updated or maintained. Please develop mobile content with the OpenXR 2-in-1 Unity / Unreal package.
Hello! In this section, we are going to talk about how to get data from the VIVE Facial Tracking feature. The Facial Tracking feature provided by VIVE OpenXR lets you read the player's current facial expression. Getting this information may seem complex, but it is quite simple once we break it down: the Facial Tracking feature is simply a combination of Eye Expression and Lip Expression.
In VIVE Facial Tracking, the data from Eye Expression and Lip Expression are described with enums.
For example, the value associated with the enum XR_EYE_EXPRESSION_LEFT_BLINK_HTC tells how closed the player's left eye is. A value approaching 1 means the eye is closing; a value approaching 0 means the eye is open.
The same notion applies to the data from Lip Expression. For example, the value associated with the enum XR_LIP_EXPRESSION_JAW_OPEN_HTC indicates how wide the player's mouth is open.
Therefore, when we get data from the Facial Tracking feature, what we are actually doing is getting the data from Eye Expression and Lip Expression simultaneously.
By the way, I encourage you to have a look at the Khronos OpenXR website for a deeper understanding of what each enum means.
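To make this concrete, here is a minimal sketch of how such a weight would be read. It assumes a float array eyeExps indexed by the enum values, which is exactly how we will store the data in the steps below.

// Minimal sketch: each enum value doubles as an index into a weight array.
// (eyeExps is declared in Step 1 and filled in Step 3 below.)
float leftBlink = eyeExps[(int)XrEyeExpressionHTC.XR_EYE_EXPRESSION_LEFT_BLINK_HTC];
bool eyeNearlyClosed = leftBlink > 0.9f; // 0 = open, 1 = closed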
Now that we have a brief understanding of what the data provided by Facial Tracking represents, let's set up your project for Facial Tracking.
Set up Your Project to use Facial Tracking
Before we start, if you haven't set up your project for running on VIVE devices using OpenXR, please go check that section first.
Step 1. Check your VIVE OpenXR Plugin version
In Window > Package Manager, make sure your VIVE OpenXR Plugin - Android version is 1.0.3 or newer.
Step 2. Enable the Facial Tracking feature
In Edit > Project Settings > XR Plug-in Management > OpenXR, enable the VIVE Focus3 Facial Tracking feature.
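If you want to verify at runtime that the feature is actually enabled, a quick sanity check like the sketch below can help. Note that the namespace VIVE.FacialTracking is an assumption here; check the scripts shipped with your plugin version for the exact one.

using UnityEngine;
using UnityEngine.XR.OpenXR;
using VIVE.FacialTracking; // assumed namespace; verify against your plugin version

public class FacialTrackingCheck : MonoBehaviour
{
    void Start()
    {
        // GetFeature<T>() returns null if the feature is not present;
        // 'enabled' reflects the checkbox in XR Plug-in Management > OpenXR.
        var feature = OpenXRSettings.Instance.GetFeature<ViveFacialTracking>();
        Debug.Log("Facial Tracking available: " + (feature != null && feature.enabled));
    }
}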
How to Get the Facial Tracking Data
Now that we have a brief understanding of what the Facial Tracking data represents, and have made sure that your project is all set, let me show you what we need to do in order to get the data.
P.S. You can also check the API Reference of Facial Tracking here.
Step 1. Declare two float arrays to store the data from Eye Expression and Lip Expression
In any script in which you wish to get the data, prepare two arrays to store the data from Eye Expression and Lip Expression.
// The ..._MAX_ENUM_HTC sentinel at the end of each enum gives the number of
// expressions, so each array holds one weight per expression.
private static float[] eyeExps = new float[(int)XrEyeExpressionHTC.XR_EYE_EXPRESSION_MAX_ENUM_HTC];
private static float[] lipExps = new float[(int)XrLipExpressionHTC.XR_LIP_EXPRESSION_MAX_ENUM_HTC];
Step 2. Get the feature instance
To use the feature, get the feature instance with the following line of code:
var feature = OpenXRSettings.Instance.GetFeature<ViveFacialTracking>();
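GetFeature<ViveFacialTracking>() returns null when the feature is disabled or unsupported, so a common pattern, sketched below, is to cache the reference once and null-check it before use.

private ViveFacialTracking facialTracking;

void Start()
{
    // Cache the feature once; it stays valid for the session.
    facialTracking = OpenXRSettings.Instance.GetFeature<ViveFacialTracking>();
    if (facialTracking == null)
    {
        Debug.LogWarning("VIVE Facial Tracking feature not found or not enabled.");
    }
}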
Step 3. Put the data into the two arrays
Use the function GetFacialExpressions to get the data and put it into the two arrays, eyeExps and lipExps.
void Update()
{
    var feature = OpenXRSettings.Instance.GetFeature<ViveFacialTracking>();
    if (feature != null)
    {
        // Eye expressions
        {
            if (feature.GetFacialExpressions(XrFacialTrackingTypeHTC.XR_FACIAL_TRACKING_TYPE_EYE_DEFAULT_HTC, out float[] exps))
            {
                eyeExps = exps;
            }
        }
        // Lip expressions
        {
            if (feature.GetFacialExpressions(XrFacialTrackingTypeHTC.XR_FACIAL_TRACKING_TYPE_LIP_DEFAULT_HTC, out float[] exps))
            {
                lipExps = exps;
            }
        }
    }

    // How wide the user's mouth is open. 0 = closed, 1 = fully opened
    Debug.Log("Jaw Open: " + lipExps[(int)XrLipExpressionHTC.XR_LIP_EXPRESSION_JAW_OPEN_HTC]);

    // How closed the user's left eye is. 0 = opened, 1 = fully closed
    Debug.Log("Left Eye Blink: " + eyeExps[(int)XrEyeExpressionHTC.XR_EYE_EXPRESSION_LEFT_BLINK_HTC]);
}
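Once the two arrays are updated every frame, you can drive an avatar with them. The sketch below maps the jaw-open weight onto a SkinnedMeshRenderer blendshape; avatarFace and jawOpenBlendShapeIndex are hypothetical names that depend on your own model.

[SerializeField] private SkinnedMeshRenderer avatarFace;  // hypothetical avatar mesh
[SerializeField] private int jawOpenBlendShapeIndex = 0;  // depends on your model's blendshape order

void LateUpdate()
{
    // GetFacialExpressions returns weights in [0, 1]; Unity blendshape
    // weights run from 0 to 100, so scale accordingly.
    float jawOpen = lipExps[(int)XrLipExpressionHTC.XR_LIP_EXPRESSION_JAW_OPEN_HTC];
    avatarFace.SetBlendShapeWeight(jawOpenBlendShapeIndex, jawOpen * 100f);
}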