Integrate VIVE OpenXR Facial Tracking with MetaHuman

OpenXR Facial Tracking Plugin Setup

Supported Unreal Engine versions: 4.26 and later

  • Enable Plugins:
    • Enable the following plugin under Edit > Plugins > Virtual Reality:
      • OpenXR
image1.png
  • Disable Plugins:
    • The "SteamVR" plugin must be disabled for OpenXR to work.
    • Disable the following plugin under Edit > Plugins > Virtual Reality (or via the .uproject file, as shown in the sketch after these steps):
      • SteamVR
image2.png
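If you prefer to record these settings in the project file itself, the same enable/disable state can be expressed in your .uproject. This is a minimal sketch, assuming the plugin names match the ones shown in the editor; a real .uproject contains other fields as well:

    {
      "Plugins": [
        { "Name": "OpenXR",  "Enabled": true  },
        { "Name": "SteamVR", "Enabled": false }
      ]
    }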

  • Project Settings:
    • Make sure the "OpenXR Facial Tracking extension" is enabled; the setting is in Edit > Project Settings > Plugins > Vive OpenXR > Facial Tracking:
      • Enable Facial Tracking
image3.png

Initialize / Release Facial Tracker

Before using OpenXR Facial Tracking, a few steps are needed to make sure the tracking process runs correctly: create the facial tracker before use and release it when finished.

  • Initialize Facial Tracker
    • Create Facial Tracker
      • Pass in (or select) Eye or Lip as the input of this function to indicate the tracking type to be created (a sketch of the underlying call follows the screenshot below).
      • This function is usually called at the start of the game.
image4.png
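The Create Facial Tracker node corresponds to the XR_HTC_facial_tracking OpenXR extension. The following is a minimal C++ sketch of that call, assuming a valid XrInstance and XrSession created with the "XR_HTC_facial_tracking" extension enabled; it illustrates the extension API, not the plugin's own source:

    #include <openxr/openxr.h>

    // Create an eye or lip facial tracker. Extension functions are not part of
    // the core loader exports, so they must be loaded via xrGetInstanceProcAddr.
    XrFacialTrackerHTC CreateTracker(XrInstance instance, XrSession session,
                                     XrFacialTrackingTypeHTC type)
    {
        PFN_xrCreateFacialTrackerHTC pfnCreate = nullptr;
        xrGetInstanceProcAddr(instance, "xrCreateFacialTrackerHTC",
                              reinterpret_cast<PFN_xrVoidFunction*>(&pfnCreate));

        XrFacialTrackerCreateInfoHTC createInfo{XR_TYPE_FACIAL_TRACKER_CREATE_INFO_HTC};
        // e.g. XR_FACIAL_TRACKING_TYPE_EYE_DEFAULT_HTC or
        //      XR_FACIAL_TRACKING_TYPE_LIP_DEFAULT_HTC
        createInfo.facialTrackingType = type;

        XrFacialTrackerHTC tracker = XR_NULL_HANDLE;
        if (pfnCreate != nullptr)
            pfnCreate(session, &createInfo, &tracker);
        return tracker;
    }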

  • Release Facial Tracker
    • Destroy Facial Tracker
      • This function does not take a tracking type; it determines by itself which tracking types need to be released (see the sketch after the screenshot below).
      • This function is usually called at the end of the game.

image5.png
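The matching teardown in the raw extension is xrDestroyFacialTrackerHTC, one call per tracker handle. A sketch, continuing from CreateTracker above:

    #include <openxr/openxr.h>

    // Destroy a facial tracker handle created by xrCreateFacialTrackerHTC.
    void DestroyTracker(XrInstance instance, XrFacialTrackerHTC tracker)
    {
        PFN_xrDestroyFacialTrackerHTC pfnDestroy = nullptr;
        xrGetInstanceProcAddr(instance, "xrDestroyFacialTrackerHTC",
                              reinterpret_cast<PFN_xrVoidFunction*>(&pfnDestroy));
        if (pfnDestroy != nullptr && tracker != XR_NULL_HANDLE)
            pfnDestroy(tracker);
    }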

Get Eye / Lip Facial Expressions Data

Getting the detection results.

Detected eye or lip expression results are available from the Blueprint functions "GetEyeFacialExpressions" and "GetLipFacialExpressions" (a sketch of the underlying extension call follows the screenshots below).

image6.png image7.png

image8.png image9.png
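Under the hood, both Blueprint functions poll xrGetFacialExpressionsHTC, which fills an array of normalized weights: 14 for eye expressions, 37 for lip expressions. A minimal sketch, reusing the tracker handle from CreateTracker above:

    #include <openxr/openxr.h>

    // Query the current eye expression weights (one float per expression,
    // indexed by XrEyeExpressionHTC). Returns false if tracking is inactive.
    bool GetEyeWeights(XrInstance instance, XrFacialTrackerHTC tracker,
                       float outWeights[XR_FACIAL_EXPRESSION_EYE_COUNT_HTC])
    {
        PFN_xrGetFacialExpressionsHTC pfnGet = nullptr;
        xrGetInstanceProcAddr(instance, "xrGetFacialExpressionsHTC",
                              reinterpret_cast<PFN_xrVoidFunction*>(&pfnGet));
        if (pfnGet == nullptr)
            return false;

        XrFacialExpressionsHTC expressions{XR_TYPE_FACIAL_EXPRESSIONS_HTC};
        expressions.expressionCount = XR_FACIAL_EXPRESSION_EYE_COUNT_HTC;  // 14
        expressions.expressionWeightings = outWeights;

        return XR_SUCCEEDED(pfnGet(tracker, &expressions)) && expressions.isActive;
    }

The lip query is identical except for the tracker handle and XR_FACIAL_EXPRESSION_LIP_COUNT_HTC (37).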

Feed OpenXR Facial Tracking Data to MetaHuman

Update the data from "GetEyeFacialExpressions" and "GetLipFacialExpressions" onto the MetaHuman's facial expressions.

Note: This tutorial will be presented using Blueprint and Animation Blueprint.

MetaHuman folder structure:

All > Content > MetaHumans > “Your MetaHuman”.


image10.png
  • Make sure Plugins > Vive Wave OpenXR – Windows Content > MetaHumanAssets includes the following assets:
    • Animation:
      • XR_Face_AnimBP
      • mh_XR_FacialTracking_mapping_pose
      • mh_XR_FacialTracking_mapping_anim
image11.png

  • Blueprint:
    • BP_XR_FT_Manager
    • XR_FT_StatusEnum
image12.png

  • After importing your MetaHuman, you may find that the three animation assets we provide under the Animation folder need their Skeleton retargeted. If so, follow these steps:

Note: XR_Face_AnimBP is used as the example here; the steps for the other two assets are the same.

  • Double-click XR_Face_AnimBP and click Yes when the message window pops up.
image13.png
  • Choose "Face_Archetype_Skeleton" (the Skeleton of the MetaHuman) and click OK.
image14.png

  • Get Your MetaHuman Ready
    • Set XR_Face_AnimBP in your MetaHuman Blueprint:
      • You can find your MetaHuman Blueprint in:
        • All > Content > MetaHumans > "Your MetaHuman" > "BP_YourMetaHuman"
image15.png

  • Open "BP_YourMetaHuman"
    • In the Components panel, click "Face"; then in the Details panel set Animation > Anim Class to XR_Face_AnimBP (a C++ sketch of the equivalent assignment follows the screenshot below).
image16.png
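For reference, the same assignment can be made from C++ at runtime with the engine's SetAnimInstanceClass API. This is a sketch only: the asset path below is an assumption about where the plugin content is mounted, so adjust it to the path shown in your Content Browser:

    #include "Components/SkeletalMeshComponent.h"
    #include "Animation/AnimInstance.h"

    void AssignFaceAnimBP(USkeletalMeshComponent* FaceComp)
    {
        // Hypothetical asset path; "_C" selects the generated Blueprint class.
        UClass* AnimClass = LoadClass<UAnimInstance>(
            nullptr, TEXT("/ViveOpenXR/MetaHumanAssets/Animation/XR_Face_AnimBP.XR_Face_AnimBP_C"));
        if (AnimClass != nullptr && FaceComp != nullptr)
            FaceComp->SetAnimInstanceClass(AnimClass);
    }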

  • Go back to the Level Editor's default interface and drag "BP_YourMetaHuman" into the level.
image17.png

Briefly introducing how XR_Face_AnimBP works:

-        In the EventGraph, we added GetEyeFacialExpressions and GetLipFacialExpressions to get the eye and lip facial tracking data.

-        In the AnimGraph, we update each pose weight of mh_XR_FacialTracking_mapping_pose with the eye and lip data (see the sketch after the screenshots below).

image18.png

image19.png
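In code terms, the AnimGraph step is an index-to-curve copy: each weight returned by the expression query drives the correspondingly named pose or curve. The sketch below illustrates the idea in plain C++; the curve names are invented placeholders (the real asset drives mh_XR_FacialTracking_mapping_pose), and the index order should be taken from XrEyeExpressionHTC rather than from this list:

    #include <map>
    #include <string>

    // Illustrative curve names only; consult XrEyeExpressionHTC for the real
    // index-to-expression mapping (14 eye expressions).
    static const char* kEyeCurveNames[14] = {
        "Eye_Left_Blink",   "Eye_Left_Wide",    "Eye_Right_Blink", "Eye_Right_Wide",
        "Eye_Left_Squeeze", "Eye_Right_Squeeze", "Eye_Left_Down",  "Eye_Right_Down",
        "Eye_Left_Out",     "Eye_Right_In",     "Eye_Left_In",     "Eye_Right_Out",
        "Eye_Left_Up",      "Eye_Right_Up",
    };

    // Copy normalized weights (0..1) onto named curves, one per expression.
    std::map<std::string, float> MapEyeWeightsToCurves(const float weights[14])
    {
        std::map<std::string, float> curves;
        for (int i = 0; i < 14; ++i)
            curves[kEyeCurveNames[i]] = weights[i];
        return curves;
    }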

  • Drag the BP_XR_FT_Manager into the level.
  • BP_XR_FT_Manager handles initializing and releasing the facial tracker.
  • Note: This Blueprint's functions are the same as the "FT_Framework" in the AvatarSample tutorial.
  • BP_XR_FT_Manager's path:
    • Plugins > Vive Wave OpenXR – Windows Content > MetaHumanAssets > Blueprint > BP_XR_FT_Manager
image12.png

  • After dragging BP_XR_FT_Manager into the level:
    • You can decide whether to enable or disable Eye and Lip tracking in:

Outliner panel > click BP_XR_FT_Manager > Details panel > Facial Tracking > Enable Eye / Enable Lip

image20.png

Briefly introducing how BP_XR_FT_Manager works (a lifecycle sketch follows the screenshots below):

-        We created StartEyeFramework and StartLipFramework to initialize the eye and lip facial trackers.

-        We created StopAllFrameworks to release the eye and lip facial trackers.



image21.png

image22.png
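The manager's lifecycle can be summarized in plain C++ as a small wrapper around the create/destroy sketches from the "Initialize / Release Facial Tracker" section above. The class below illustrates the pattern, not the plugin's implementation; its method names simply mirror the Blueprint functions:

    #include <openxr/openxr.h>

    class FacialTrackingManager {
    public:
        FacialTrackingManager(XrInstance inInstance, XrSession inSession)
            : instance(inInstance), session(inSession) {}

        // Create each tracker once (usually at the start of the game).
        void StartEyeFramework()
        {
            if (eye == XR_NULL_HANDLE)
                eye = CreateTracker(instance, session, XR_FACIAL_TRACKING_TYPE_EYE_DEFAULT_HTC);
        }
        void StartLipFramework()
        {
            if (lip == XR_NULL_HANDLE)
                lip = CreateTracker(instance, session, XR_FACIAL_TRACKING_TYPE_LIP_DEFAULT_HTC);
        }

        // Release everything (usually at the end of the game).
        void StopAllFrameworks()
        {
            if (eye != XR_NULL_HANDLE) { DestroyTracker(instance, eye); eye = XR_NULL_HANDLE; }
            if (lip != XR_NULL_HANDLE) { DestroyTracker(instance, lip); lip = XR_NULL_HANDLE; }
        }

    private:
        XrInstance instance;
        XrSession session;
        XrFacialTrackerHTC eye = XR_NULL_HANDLE;
        XrFacialTrackerHTC lip = XR_NULL_HANDLE;
    };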

Result

image23.png