Integrate Facial Tracking Data With Your Avatar

Note: Before you start, install or update to the latest VIVE Software from SteamVR or the OOBE, and check that SR_Runtime is running.

OpenXR Facial Tracking Plugin Setup

Supported Unreal Engine version: 4.26+

  • Enable Plugins:
    • Enable the following plugin in Edit > Plugins > Virtual Reality:
      • OpenXR
image1.png
  • Disable Plugins:
    • The "SteamVR" plugin must be disabled for OpenXR to work.
    • Disable the following plugin in Edit > Plugins > Virtual Reality:
      • SteamVR
image2.png
  • Project Settings:
    • Make sure the "OpenXR Facial Tracking extension" is enabled; the setting is in Edit > Project Settings > Plugins > Vive OpenXR > Facial Tracking:
      • Enable Facial Tracking
image3.png

Initial / Release Facial Tracker

To use OpenXR Facial Tracking, a few steps are needed to make sure the facial tracking process runs correctly.

  • Initial Facial Tracker
    • Create Facial Tracker
      • Pass in (or select) Eye or Lip as the input of this function to indicate which tracking type to create.
      • This function is usually called at the start of the game.
image4.png
    • Release Facial Tracker
      • Destroy Facial Tracker
        • This function does not need a tracking type as input; it determines by itself which tracking types need to be released.
        • This function is usually called at the end of the game.

image5.png
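The create/release lifecycle above can be sketched in plain C++. The ViveOpenXR plugin exposes these as blueprint nodes; the stubs below only stand in for the real plugin calls, so their bodies and signatures are illustrative:

```cpp
#include <cassert>
#include <set>
#include <string>

// Stand-in for the plugin's internal tracker bookkeeping (illustrative only).
static std::set<std::string> g_activeTrackers;

// CreateFacialTracker takes the tracking type (Eye or Lip) to create.
// Usually called at the start of the game.
void CreateFacialTracker(const std::string& trackingType) {
    g_activeTrackers.insert(trackingType);
}

// DestroyFacialTracker takes no type argument: it releases every
// tracker that was created. Usually called at the end of the game.
void DestroyFacialTracker() {
    g_activeTrackers.clear();
}
```

The key point is the asymmetry: creation is per tracking type, while release tears down everything that was created.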

Get Eye / Lip Facial Expressions Data

Getting the detection results.

Detected Eye or Lip expression results are available from the blueprint functions "GetEyeFacialExpressions" and "GetLipFacialExpressions".

image7.png
image8.png image9.png
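Each of these functions reports a weight per expression; in the underlying XR_HTC_facial_tracking extension a weight is a float in [0, 1]. A minimal sketch of consuming such a result, with a hypothetical name-to-weight map standing in for the real blueprint output:

```cpp
#include <algorithm>
#include <cassert>
#include <map>
#include <string>

// Hypothetical container standing in for the blueprint result:
// expression name -> raw weight reported by the tracker.
using ExpressionWeights = std::map<std::string, float>;

// Clamp every weight into [0, 1] before feeding it to the avatar,
// so a noisy reading can never over- or under-drive a blend shape.
ExpressionWeights SanitizeWeights(const ExpressionWeights& raw) {
    ExpressionWeights out;
    for (const auto& [name, weight] : raw)
        out[name] = std::clamp(weight, 0.0f, 1.0f);
    return out;
}
```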

Feed OpenXR FacialTracking Eye & Lip Data to Avatar

Use the data from "GetEyeFacialExpressions" and "GetLipFacialExpressions" to update the avatar's eye and lip shapes.

Note: The Avatar used in this tutorial is the model in the ViveOpenXR sample.

Note: This tutorial will be presented using C++.

  • Essential Setup
    • Import head and eyes skeleton models/textures/materials.
image10.png
image11.png image12.png

  • FT_Framework: Handle initial / release Facial Tracker
    • Create new C++ Class:
    • Content Browser > All > C++ Classes > right-click > choose New C++ Class…
image13.png
  • After choosing New C++ Class, the "Add C++ Class" window pops up. In the CHOOSE PARENT CLASS step:
    • Choose "None" and press "Next".
image14.png
  • In NAME YOUR NEW CLASS step:
    • Class Type: Private
    • Name: FT_Framework
    • After the previous settings are completed, press Create Class.
image15.png

  • Open FT_Framework.h
    • Add the following #include statements at the top.
      • FT_Framework.h
      • image16.png
    • Add the following Properties under public:
      • FT_Framework.h
      • image17.png
    • Add the following Properties under private:
      • FT_Framework.h
      • image18.png

  • Open FT_Framework.cpp
    • Add the following code underneath the line #include "FT_Framework.h".
      • Initialize m_Framework.
        • FT_Framework.cpp
        • image19.png
    • Add the following implementation for Instance() function:
      • Get the instance of the FT_Framework class, or create one if it does not exist.
        • FT_Framework.cpp
        • image20.png
    • Add the following implementation for DestroyFramework() function:
      • Delete the instance of the FT_Framework class.
        • FT_Framework.cpp
        • image21.png
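The Instance()/DestroyFramework() pair described above is a plain singleton. Stripped of the Unreal-specific parts, the pattern looks roughly like this (the member name m_Framework follows the tutorial; everything else is an illustrative sketch, not the sample's exact code):

```cpp
#include <cassert>

class FT_Framework {
public:
    // Return the single instance, creating it on first use.
    static FT_Framework* Instance() {
        if (m_Framework == nullptr)
            m_Framework = new FT_Framework();
        return m_Framework;
    }

    // Delete the instance when the framework is no longer needed.
    static void DestroyFramework() {
        delete m_Framework;
        m_Framework = nullptr;
    }

private:
    FT_Framework() = default;
    static FT_Framework* m_Framework;
};

// The initialization the tutorial adds underneath #include "FT_Framework.h".
FT_Framework* FT_Framework::m_Framework = nullptr;
```

Resetting the pointer to nullptr in DestroyFramework() is what lets a later Instance() call safely recreate the framework.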

  • Add the following implementation for StartEyeFramework() function:
    • This function calls CreateFacialTracker(Eye) from the ViveOpenXR plugin to initialize the eye tracker.
      • FT_Framework.cpp
      • image22.png

  • Add the following implementation for StartLipFramework() function:
    • This function calls CreateFacialTracker(Lip) from the ViveOpenXR plugin to initialize the lip tracker.
    • FT_Framework.cpp
    • image23.png

  • Add the following implementation for StopAllFramework() function:
    • This function calls DestroyFacialTracker() from the ViveOpenXR plugin to release all facial trackers.
    • FT_Framework.cpp
    • image24.png

  • FT_AvatarSample: Update the facial expression shapes of the avatar's skeletal mesh via the OpenXR FacialTracking results.
    • Create new C++ Class:
    • Content Browser > All > C++ Classes > right-click > choose New C++ Class…
    • image13.png
  • After choosing New C++ Class, the "Add C++ Class" window pops up. In the CHOOSE PARENT CLASS step:
    • Choose "Actor" and press "Next".
    • image25.png
  • In NAME YOUR NEW CLASS step:
    • Class Type: Public
    • Name: FT_AvatarSample
    • After the previous settings are completed, press Create Class.
    • image14.png
  • Open FT_AvatarSample.h
    • Add the following #include statements at the top.
      • FT_AvatarSample.h
        image27.png
    • Add the following Properties under public:
      • FT_AvatarSample.h
        • image26.png

  • Add the following Properties under private:
    • FT_AvatarSample.h
    • image28.png

  • Open FT_AvatarSample.cpp
    • Add the following #include statements at the top.
      • FT_AvatarSample.cpp
      • image29.png
    • Add the following code inside the constructor for AFT_AvatarSample:
      • Definition for the Skeletal Mesh that will serve as our visual representation.
        • FT_AvatarSample.cpp
        • image30.png

  • Add the following implementation for BeginPlay() function:
    • Create eye and lip facial tracker.
      • FT_AvatarSample.cpp
      • image31.png
    • Store the mapping between the avatar's eye blend shapes (key) and the OpenXRFacialTracking eye expressions (value) in the EyeShapeTable.
      • FT_AvatarSample.cpp
      • image32.png

  • Store the mapping between the avatar's lip blend shapes (key) and the OpenXRFacialTracking lip expressions (value) in the LipShapeTable.
    • FT_AvatarSample.cpp
    • image33.png
    • The blend shapes can be found in the avatar's skeletal mesh asset.
    • image34.png
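In Unreal these tables are typically TMap members keyed by blend-shape name; the idea can be sketched with standard containers. The blend-shape and expression names below are hypothetical placeholders, not the sample's exact identifiers:

```cpp
#include <cassert>
#include <map>
#include <string>

// Tracker-side eye expression identifiers (illustrative subset).
enum class EEyeExpression { LeftBlink, RightBlink, LeftWide, RightWide };

// Map each avatar blend-shape name (key) to the tracked
// expression that should drive it (value).
std::map<std::string, EEyeExpression> BuildEyeShapeTable() {
    return {
        {"Eye_Left_Blink",  EEyeExpression::LeftBlink},
        {"Eye_Right_Blink", EEyeExpression::RightBlink},
        {"Eye_Left_Wide",   EEyeExpression::LeftWide},
        {"Eye_Right_Wide",  EEyeExpression::RightWide},
    };
}
```

The LipShapeTable follows the same shape, just with lip blend-shape names and lip expression identifiers.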
  • Add the following implementation for RenderModelShape() function:
    • Render the result of face tracking to the avatar's blend shapes.
      • FT_AvatarSample.cpp
      • image35.png
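Conceptually, RenderModelShape() walks the shape table and pushes each tracked weight into the matching blend shape. A sketch of that loop, with a callback standing in for the engine's per-shape update call (in Unreal this would be USkeletalMeshComponent::SetMorphTarget; the parameter shapes here are assumptions):

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>

// weights: expression id -> tracked weight in [0, 1].
// table:   blend-shape name -> expression id driving it.
// setMorphTarget: stand-in for the engine's morph-target setter.
void RenderModelShape(
    const std::map<int, float>& weights,
    const std::map<std::string, int>& table,
    const std::function<void(const std::string&, float)>& setMorphTarget) {
    for (const auto& [shapeName, expression] : table) {
        auto it = weights.find(expression);
        // Drive the blend shape only when the tracker reported a value.
        if (it != weights.end())
            setMorphTarget(shapeName, it->second);
    }
}
```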
  • Add the following implementation for UpdateGazeRay() function:
    • This function calculates the gaze direction and updates the eye anchors' rotation to represent the direction of the eyes' gaze.
      • FT_AvatarSample.cpp
      • image36.png
      • image37.png
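One common way to turn a gaze direction vector into an eye-anchor rotation is to derive yaw and pitch from the normalized direction. A sketch under that assumption (the sample's actual math and axis conventions may differ):

```cpp
#include <cassert>
#include <cmath>

struct GazeRotation {
    float PitchDeg;  // up/down
    float YawDeg;    // left/right
};

// Convert a normalized gaze direction (x = right, y = up, z = forward)
// into pitch/yaw angles that can be applied to the eye anchor.
GazeRotation GazeDirectionToRotation(float x, float y, float z) {
    const float kRadToDeg = 180.0f / 3.14159265358979f;
    GazeRotation r;
    r.YawDeg   = std::atan2(x, z) * kRadToDeg;
    r.PitchDeg = std::atan2(y, std::sqrt(x * x + z * z)) * kRadToDeg;
    return r;
}
```

For example, a straight-ahead gaze (0, 0, 1) yields zero pitch and yaw, while (1, 0, 1) yields a 45-degree yaw.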


  • Add the following implementation for Tick(float DeltaTime) function:
    • Update avatar’s Eye and Lip shapes every frame.
    • Update eyes gaze direction.
      • FT_AvatarSample.cpp
      • image38.png

  • Add the following implementation for EndPlay(const EEndPlayReason::Type EndPlayReason) function:
    • Release Eye and Lip Facial Tracker.
    • FT_AvatarSample.cpp
    • image39.png

  • Add FT_AvatarSample to game level:
    • Create Blueprint class based on FT_AvatarSample :
      • Content Browser > All > C++ Classes > Public > right-click FT_AvatarSample > choose Create Blueprint class based on FT_AvatarSample
      • image40.png
    • In NAME YOUR NEW FT AVATAR SAMPLE step:
      • Name: BP_FT_AvatarSample
      • Path: All > Content > Blueprints
      • After the previous settings are completed, press Create Blueprint Class.
      • image41.png

  • After Blueprint class is created:
    • Open the blueprint: in All > Content > Blueprints, you will find BP_FT_AvatarSample.
    • After opening it, confirm that the Parent Class is FT_AvatarSample.
    • image42.png
    • You can see the components from the C++ parent in the Components panel.
    • image43.png

  • In the Components panel, set the head and eye skeletal meshes for each component:
    • Set the Head Skeletal Mesh in HeadModel > Details panel > Mesh > Skeletal Mesh.
      • image44.png image45.png
    • Set Left Eye Skeletal Mesh in EyeModel_L > Details panel > Mesh > Skeletal Mesh.
      • image44.png image46.png

  • Set Right Eye Skeletal Mesh in EyeModel_R > Details panel > Mesh > Skeletal Mesh.
    • image44.png image47.png

  • Go back to the editor HOME interface and drag "BP_FT_AvatarSample" into the Level.
    • You can enable or disable Eye and Lip tracking in:
      • Outliner panel > click BP_FT_AvatarSample > Details panel > Facial Tracking Settings > Enable Eye / Enable Lip.
      • image48.png

Result

image49.png