
Facial Tracking

Extension introduction

XR_HTC_facial_tracking allows developers to create applications that reproduce the user's actual facial expressions on 3D avatars.
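
At the OpenXR level the extension reports facial data as arrays of normalized blend-shape weights: 14 eye expressions and 37 lip expressions. The ViveOpenXR plugin wraps these calls for you, but the following sketch of the raw API may help clarify what the Blueprint functions described later return. It assumes an XrInstance and XrSession created with XR_HTC_facial_tracking enabled, and omits error handling.

#include <openxr/openxr.h>

// Extension entry points are not exported by the loader; load them by name.
PFN_xrCreateFacialTrackerHTC  pfnCreateFacialTracker  = nullptr;
PFN_xrGetFacialExpressionsHTC pfnGetFacialExpressions = nullptr;
PFN_xrDestroyFacialTrackerHTC pfnDestroyFacialTracker = nullptr;

void QueryEyeExpressions(XrInstance instance, XrSession session)
{
    xrGetInstanceProcAddr(instance, "xrCreateFacialTrackerHTC",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnCreateFacialTracker));
    xrGetInstanceProcAddr(instance, "xrGetFacialExpressionsHTC",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnGetFacialExpressions));
    xrGetInstanceProcAddr(instance, "xrDestroyFacialTrackerHTC",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnDestroyFacialTracker));

    // One tracker handle per tracking type (eye or lip).
    XrFacialTrackerCreateInfoHTC createInfo{XR_TYPE_FACIAL_TRACKER_CREATE_INFO_HTC};
    createInfo.facialTrackingType = XR_FACIAL_TRACKING_TYPE_EYE_DEFAULT_HTC;

    XrFacialTrackerHTC eyeTracker = XR_NULL_HANDLE;
    pfnCreateFacialTracker(session, &createInfo, &eyeTracker);

    // Poll the current blend-shape weights (14 values for eyes, 37 for lips).
    float weights[XR_FACIAL_EXPRESSION_EYE_COUNT_HTC] = {};
    XrFacialExpressionsHTC expressions{XR_TYPE_FACIAL_EXPRESSIONS_HTC};
    expressions.expressionCount      = XR_FACIAL_EXPRESSION_EYE_COUNT_HTC;
    expressions.expressionWeightings = weights;
    pfnGetFacialExpressions(eyeTracker, &expressions);

    if (expressions.isActive)
    {
        // e.g. weights[XR_EYE_EXPRESSION_LEFT_BLINK_HTC] is the left-eye blink weight in [0, 1].
    }

    pfnDestroyFacialTracker(eyeTracker);
}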

Supported Platform and Devices

Platform | Headset                                         | Supported Plugin Version
PC       | PC Streaming: Focus 3 / XR Elite / Focus Vision | V 2.0.0 and above
PC       | Pure PC: Vive Cosmos                            | Not supported
PC       | Pure PC: Vive Pro series                        | V 2.0.0 and above
AIO      | Focus 3 / XR Elite / Focus Vision               | V 2.0.0 and above


Enable Plugins

  1. Go to Edit > Plugins, search for OpenXR and ViveOpenXR, and make sure they are enabled.
  2. Go to Edit > Plugins > Built-in > Virtual Reality and enable OpenXREyeTracker.
  3. Note that the "SteamVR" and "OculusVR" plugins must be disabled for OpenXR to work.
  4. Restart the engine for the changes to take effect.
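
If you need to confirm at runtime that the headset and OpenXR runtime you ended up on actually support facial tracking, the extension defines a system-properties query for it. The ViveOpenXR plugin performs this kind of check internally; the sketch below shows the raw OpenXR check and assumes you have access to the XrInstance and XrSystemId, which a typical Blueprint project does not need.

#include <openxr/openxr.h>

// Returns true if the system reports either eye or lip facial tracking.
bool SystemSupportsFacialTracking(XrInstance instance, XrSystemId systemId,
                                  bool& outEye, bool& outLip)
{
    XrSystemFacialTrackingPropertiesHTC facialProps{XR_TYPE_SYSTEM_FACIAL_TRACKING_PROPERTIES_HTC};
    XrSystemProperties sysProps{XR_TYPE_SYSTEM_PROPERTIES};
    sysProps.next = &facialProps;  // chain the extension struct onto the query

    if (XR_FAILED(xrGetSystemProperties(instance, systemId, &sysProps)))
        return false;

    outEye = facialProps.supportEyeFacialTracking == XR_TRUE;
    outLip = facialProps.supportLipFacialTracking == XR_TRUE;
    return outEye || outLip;
}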


How to use OpenXR Facial Tracking Unreal Feature

  1. Please make sure the ViveOpenXR plugin is enabled.
  2. Go to Edit > Project Settings > Plugins > Vive OpenXR and click Enable Facial Tracking under Facial Tracking to enable the OpenXR Facial Tracking extension.
  3. Restart the engine to apply the new setting after clicking Enable Facial Tracking.
  4. For the available FacialTracking functions, please refer to ViveOpenXRFacialTrackingFunctionLibrary.cpp.
  5. In the Blueprint editor, type Facial Tracking to find the Facial Tracking Blueprint functions your content needs; a minimal C++ sketch of the same calls follows this list.

    1. Get Eye Facial Expressions

      Provides the blend shapes of the eyes.
    2. Get Lip Facial Expressions

      Provides the blend shapes of the lips.
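
Both nodes are also callable from C++ through the function library mentioned in step 4. The sketch below is hypothetical: the class, enum, and parameter names are assumptions based on the Blueprint node names, so verify them against ViveOpenXRFacialTrackingFunctionLibrary.h in your plugin version before using.

#include "ViveOpenXRFacialTrackingFunctionLibrary.h"   // header name assumed

void ReadFacialWeights()
{
    // Assumed behavior: each call fills a map of expression -> weight in [0, 1]
    // and returns whether valid data was available this frame.
    TMap<EEyeShape, float> EyeWeights;   // enum type assumed
    TMap<ELipShape, float> LipWeights;   // enum type assumed

    const bool bEyeValid = UViveOpenXRFacialTrackingFunctionLibrary::GetEyeFacialExpressions(EyeWeights);
    const bool bLipValid = UViveOpenXRFacialTrackingFunctionLibrary::GetLipFacialExpressions(LipWeights);

    if (bEyeValid && bLipValid)
    {
        // Feed the weights to your avatar, e.g. via morph targets
        // (see the sketch at the end of this page).
    }
}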


Play the sample map

  1. Make sure the OpenXR Facial Tracking extension is enabled; the setting is in Edit > Project Settings > Plugins > Vive OpenXR.
  2. The sample map is under Content > FacialTracking > Map.
  3. Start playing the FacialTracking map; you should see the avatar's expression change with your own. A sketch of how tracked weights can drive an avatar follows below.
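
For reference, the general pattern the sample illustrates is to push the tracked weights onto the avatar's morph targets every frame. The snippet below is only a sketch of that idea using the standard Unreal SetMorphTarget API; the morph-target names are placeholders, and the shipped sample map may drive the avatar differently (for example through an Animation Blueprint).

#include "Components/SkeletalMeshComponent.h"

// Pushes two blink weights (0..1) onto the avatar's morph targets.
void ApplyEyeBlink(USkeletalMeshComponent* AvatarMesh, float LeftBlinkWeight, float RightBlinkWeight)
{
    if (!AvatarMesh)
    {
        return;
    }

    // Morph-target names must match the avatar asset; these are placeholders.
    AvatarMesh->SetMorphTarget(TEXT("Eye_Left_Blink"),  LeftBlinkWeight);
    AvatarMesh->SetMorphTarget(TEXT("Eye_Right_Blink"), RightBlinkWeight);
}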