Integrate VIVE OpenXR Facial Tracking with MetaHuman
OpenXR Facial Tracking Plugin Setup
Supported Unreal Engine version: 4.26+
Please enable the plugin in Edit > Plugins > Virtual Reality.
Please disable the SteamVR plugin in Edit > Plugins > Virtual Reality:
- The “Steam VR” plugin must be disabled for OpenXR to work.
Please make sure the “OpenXR Facial Tracking extension” is enabled. The setting is in Edit > Project Settings > Plugins > Vive OpenXR > Facial Tracking:
- Enable Facial Tracking
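For reference, enabling this setting makes the plugin request the XR_HTC_facial_tracking OpenXR extension when the OpenXR instance is created. You do not need to do this yourself in Unreal; the sketch below only illustrates the spec-level equivalent for a standalone OpenXR app.

```cpp
// Spec-level sketch only: the Unreal plugin performs this step for you
// when "Enable Facial Tracking" is checked.
#include <cstring>
#include <openxr/openxr.h>

XrInstance CreateInstanceWithFacialTracking()
{
    // XR_HTC_FACIAL_TRACKING_EXTENSION_NAME expands to "XR_HTC_facial_tracking".
    const char* extensions[] = { XR_HTC_FACIAL_TRACKING_EXTENSION_NAME };

    XrInstanceCreateInfo createInfo{ XR_TYPE_INSTANCE_CREATE_INFO };
    createInfo.enabledExtensionCount = 1;
    createInfo.enabledExtensionNames = extensions;
    std::strcpy(createInfo.applicationInfo.applicationName, "FacialTrackingSample");
    createInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance = XR_NULL_HANDLE;
    xrCreateInstance(&createInfo, &instance); // check the XrResult in real code
    return instance;
}
```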
Initialize / Release Facial Tracker
To use OpenXR Facial Tracking, a few setup and teardown steps are needed to make sure the facial tracking process works correctly.
Initialize Facial Tracker
Create Facial Tracker
- Pass in (or select) Eye or Lip as the input of this function to indicate the tracking type to be created (see the sketch after this list).
- This function is usually called at the start of the game.
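Under the hood, this Blueprint node corresponds to xrCreateFacialTrackerHTC from the XR_HTC_facial_tracking extension. A minimal spec-level sketch, assuming a raw OpenXR app rather than plugin code:

```cpp
#include <openxr/openxr.h>

// trackingType is XR_FACIAL_TRACKING_TYPE_EYE_DEFAULT_HTC or
// XR_FACIAL_TRACKING_TYPE_LIP_DEFAULT_HTC, matching the Eye / Lip input.
XrFacialTrackerHTC CreateFacialTracker(XrInstance instance, XrSession session,
                                       XrFacialTrackingTypeHTC trackingType)
{
    // Extension functions are loaded through xrGetInstanceProcAddr.
    PFN_xrCreateFacialTrackerHTC pfnCreate = nullptr;
    xrGetInstanceProcAddr(instance, "xrCreateFacialTrackerHTC",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnCreate));

    XrFacialTrackerCreateInfoHTC createInfo{ XR_TYPE_FACIAL_TRACKER_CREATE_INFO_HTC };
    createInfo.facialTrackingType = trackingType;

    XrFacialTrackerHTC tracker = XR_NULL_HANDLE;
    pfnCreate(session, &createInfo, &tracker); // check the XrResult in real code
    return tracker;
}
```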
Release Facial Tracker
Destroy Facial Tracker
- This function does not need a tracking type as input; it determines by itself which tracking types need to be released (see the sketch after this list).
- This function is usually called at the end of the game.
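This matches xrDestroyFacialTrackerHTC at the spec level. A sketch, assuming one handle was kept per tracking type:

```cpp
#include <openxr/openxr.h>

// The Blueprint node tracks which tracking types were created;
// here we simply destroy whichever handles are valid.
void DestroyFacialTrackers(XrInstance instance,
                           XrFacialTrackerHTC eyeTracker,
                           XrFacialTrackerHTC lipTracker)
{
    PFN_xrDestroyFacialTrackerHTC pfnDestroy = nullptr;
    xrGetInstanceProcAddr(instance, "xrDestroyFacialTrackerHTC",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnDestroy));

    if (eyeTracker != XR_NULL_HANDLE) pfnDestroy(eyeTracker);
    if (lipTracker != XR_NULL_HANDLE) pfnDestroy(lipTracker);
}
```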
Get Eye / Lip Facial Expressions Data
Getting the detection result.
Detected eye or lip expression results are available from the Blueprint functions “GetEyeFacialExpressions” and “GetLipFacialExpressions”.
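At the spec level these correspond to xrGetFacialExpressionsHTC, which fills an array of normalized weights (14 eye expressions, 37 lip expressions). A sketch for the eye tracker; the lip version differs only in the count and the handle:

```cpp
#include <array>
#include <openxr/openxr.h>

// Returns the 14 normalized eye-expression weights; all zeros if the
// tracker has no valid data this frame (isActive == XR_FALSE).
std::array<float, XR_FACIAL_EXPRESSION_EYE_COUNT_HTC>
GetEyeExpressionWeights(XrInstance instance, XrFacialTrackerHTC eyeTracker)
{
    PFN_xrGetFacialExpressionsHTC pfnGet = nullptr;
    xrGetInstanceProcAddr(instance, "xrGetFacialExpressionsHTC",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnGet));

    std::array<float, XR_FACIAL_EXPRESSION_EYE_COUNT_HTC> weights{};
    XrFacialExpressionsHTC expressions{ XR_TYPE_FACIAL_EXPRESSIONS_HTC };
    expressions.expressionCount = static_cast<uint32_t>(weights.size());
    expressions.expressionWeightings = weights.data();

    pfnGet(eyeTracker, &expressions);
    if (expressions.isActive == XR_FALSE)
        weights.fill(0.0f);
    return weights;
}
```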
Feed OpenXR FacialTracking Data to MetaHuman
Update the data from “GetEyeFacialExpressions” and “GetLipFacialExpressions” to your MetaHuman’s facial expressions.
Note: This tutorial will be presented using Blueprint and Animation Blueprint.
- Follow the MetaHuman Creator – Unreal Engine documentation to import your MetaHuman.
MetaHuman folder structure:
All > Content > MetaHumans > “Your MetaHuman”.
The Vive Wave OpenXR – Windows Content folder includes the following assets:
- Animation:
- After you finish importing your MetaHuman, the three animation-related assets under the Animation folder we provide may need their skeleton retargeted. If so, follow these steps:
Note: XR_Face_AnimBP is used as the example; the steps for the other two assets are the same.
- Double-click XR_Face_AnimBP and click Yes when the message window pops up.
- Choose “Face_Archetype_Skeleton” and click OK. (This is the MetaHuman skeleton.)
Get Your MetaHuman Ready
You can find your MetaHuman Blueprint in:
- All > Content > MetaHumans > “Your MetaHuman” > “BP_YourMetaHuman”
Set XR_Face_AnimBP to your MetaHuman Blueprint:
- In the Components panel, click “Face” > Details panel > Animation > Anim Class: XR_Face_AnimBP
Go back to the Level Editor default interface and drag “BP_YourMetaHuman” into the level.
A brief introduction to the function flow of XR_Face_AnimBP:
- We have added GetEyeFacialExpressions and GetLipFacialExpressions to get the eye and lip facial tracking data in the EventGraph.
- We update each pose weight of mh_XR_FacialTracking_mapping_pose with the eye and lip data in the AnimGraph (a C++ sketch of the idea follows this list).
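For readers who prefer C++, a rough equivalent of this Animation Blueprint could look like the sketch below. The class name and properties (UXRFaceAnimInstance, EyeWeights, LipWeights) are hypothetical; the plugin ships this logic as the XR_Face_AnimBP Blueprint, not as C++.

```cpp
// Hypothetical UAnimInstance subclass; illustrative only.
#include "Animation/AnimInstance.h"
#include "XRFaceAnimInstance.generated.h"

UCLASS()
class UXRFaceAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Read in the AnimGraph to drive each pose weight of
    // mh_XR_FacialTracking_mapping_pose.
    UPROPERTY(BlueprintReadOnly, Category = "Facial Tracking")
    TArray<float> EyeWeights;

    UPROPERTY(BlueprintReadOnly, Category = "Facial Tracking")
    TArray<float> LipWeights;

    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);
        // Each frame, copy the results of GetEyeFacialExpressions (14 weights)
        // and GetLipFacialExpressions (37 weights) into the arrays above.
        // The plugin exposes these as Blueprint nodes; their C++ entry
        // points are plugin-internal, so the calls are omitted here.
    }
};
```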
Drag BP_XR_FT_Manager into the level:
- BP_XR_FT_Manager handles initializing and releasing the facial tracker.
- Note: This Blueprint’s functions are the same as the “FT_Framework” in the AvatarSample tutorial.
- You can find it in: Plugins > Vive Wave OpenXR – Windows Content > MetaHumanAssets > Blueprint > BP_XR_FT_Manager
- You can decide whether to enable or disable Eye and Lip in:
Outliner panel > click BP_XR_FT_Manager > Details panel > Facial Tracking > Enable Eye / Enable Lip
A brief introduction to the function flow of BP_XR_FT_Manager:
- We have created StartEyeFramework and StartLipFramework to initialize the eye and lip facial trackers.
- We have created StopAllFrameworks to release the eye and lip facial trackers (a rough C++ counterpart follows this list).
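A rough C++ counterpart of this manager, with hypothetical names, might look like the following; the actual BP_XR_FT_Manager implements the same flow in Blueprint.

```cpp
// Hypothetical C++ counterpart of BP_XR_FT_Manager; the shipped asset is
// a Blueprint. Class and property names here are illustrative only.
#include "GameFramework/Actor.h"
#include "XRFTManager.generated.h"

UCLASS()
class AXRFTManager : public AActor
{
    GENERATED_BODY()

public:
    // Mirrors Enable Eye / Enable Lip in the Details panel.
    UPROPERTY(EditAnywhere, Category = "Facial Tracking")
    bool bEnableEye = true;

    UPROPERTY(EditAnywhere, Category = "Facial Tracking")
    bool bEnableLip = true;

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // StartEyeFramework / StartLipFramework: create a facial tracker
        // for each enabled tracking type (Create Facial Tracker).
    }

    virtual void EndPlay(const EEndPlayReason::Type EndPlayReason) override
    {
        // StopAllFrameworks: release whichever trackers were created
        // (Destroy Facial Tracker).
        Super::EndPlay(EndPlayReason);
    }
};
```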