MetaHuman


Introduction

MetaHuman helps developers create content with realistic facial expressions.


Supported Platforms and Devices

Platform             Headset              Supported
PC (PC Streaming)    Focus 3/XR Elite     V
PC (Pure PC)         Vive Cosmos          X
PC (Pure PC)         Vive Pro series      V
AIO                  Focus 3/XR Elite     V

(V = supported, X = not supported)


  1. The ViveMetaHuman > Blueprint folder includes a Manager Actor, a LiveLinkPreset, and an Enumeration.
    MetaHumanBP.png
  2. The ViveMetaHuman > Animation folder includes a Sequence, a Pose Asset, and an Animation Blueprint.
    MetaHumanAnimBP.png


Enable Plugins

  1. In Edit > Plugins, search for OpenXR and ViveOpenXR and make sure both are enabled.
  2. In Edit > Plugins > Built-in > Virtual Reality, enable the OpenXREyeTracker plugin.
  3. Note that the "SteamVR" and "OculusVR" plugins must be disabled for OpenXR to work.
  4. Restart the engine for the changes to take effect.
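
If you want to catch a misconfigured project early, you can assert the expected plugin states at startup. A minimal C++ sketch, assuming the internal plugin names match the UI labels above (in particular, ViveOpenXR is an assumption for the marketplace plugin's name):

    // Minimal sketch: verify the plugin states required by the steps above.
    #include "CoreMinimal.h"
    #include "Interfaces/IPluginManager.h"

    static bool IsPluginEnabled(const FString& Name)
    {
        TSharedPtr<IPlugin> Plugin = IPluginManager::Get().FindPlugin(Name);
        return Plugin.IsValid() && Plugin->IsEnabled();
    }

    static void CheckFacialTrackingPlugins()
    {
        ensureMsgf(IsPluginEnabled(TEXT("OpenXR")), TEXT("Enable the OpenXR plugin."));
        ensureMsgf(IsPluginEnabled(TEXT("OpenXREyeTracker")), TEXT("Enable the OpenXREyeTracker plugin."));
        ensureMsgf(IsPluginEnabled(TEXT("ViveOpenXR")), TEXT("Enable the ViveOpenXR plugin."));
        // SteamVR and OculusVR conflict with OpenXR and must stay disabled.
        ensureMsgf(!IsPluginEnabled(TEXT("SteamVR")), TEXT("Disable the SteamVR plugin."));
        ensureMsgf(!IsPluginEnabled(TEXT("OculusVR")), TEXT("Disable the OculusVR plugin."));
    }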


Integrate VIVE OpenXR Facial Tracking with MetaHuman

  1. Make sure the ViveOpenXR plugin is installed and the OpenXREyeTracker plugin is enabled.
  2. Go to Edit > Project Settings > Plugins > Vive OpenXR and check Enable Facial Tracking under Facial Tracking to enable the OpenXR Facial Tracking extension.
  3. Import your MetaHuman.

    i. From the Window dropdown menu, open Quixel Bridge.
    QuixelBridge.png

    ii. Pick a MetaHuman you like, download it, and add it to your project.

    iii. After importing the MetaHuman, several guideline prompts will pop up.
    Ignore the one about Apple ARKit facial support and press Dismiss.

    For the prompt about the render settings, press Enable Missing.

    For the prompt about the Groom plugin, press Enable Missing.

  4. Under ViveMetaHuman > Animation, double-click the animation blueprint XR_Face_AnimBP.
    ClickAnimBP.png
    If a pop-up window appears saying Could not find skeleton, click Yes to link your MetaHuman face skeleton to the animation blueprint.
    SkeletonWarning.png
    Then select the Face_Archetype_Skeleton.
    ChooseSkeleton.png

  5. Open the Animation Blueprint's AnimGraph and check that the Live Link Pose nodes are set to the Eye and Lip subjects (a runtime check that lists the Live Link subjects is sketched after these steps).

    AnimGraphLiveLinkPose.png

  6. Open BP_Ada (or the blueprint named after your MetaHuman) under MetaHuman > Ada (your MetaHuman's folder).
    ClickAda.png

  7. Change the Face Anim Class to XR_Face_AnimBP (a C++ equivalent of this step is sketched after these steps).

  8. Drag your MetaHuman into the map.

  9. Next, drag the BP_XR_FT_Manager under ViveMetaHuman > Blueprint into the map.
    FTManager.png


    BP_XR_FT_Manager passes the ViveOpenXRFacialLiveLinkPreset to the Live Link Client; a C++ sketch of this preset application follows these steps.

    LiveLinkPreset.png

  10. With the facial tracking devices connected, press Play to see the result.
    SampleResult.png
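
To verify step 5 at runtime, you can log every subject the Live Link client currently knows about; once BP_XR_FT_Manager has applied the preset, the Eye and Lip subjects should appear. This is a minimal C++ sketch using the engine's Live Link modular feature; only the subject names come from the steps above.

    // Minimal sketch: list all Live Link subjects so you can confirm the
    // "Eye" and "Lip" subjects used by XR_Face_AnimBP are present.
    #include "CoreMinimal.h"
    #include "Features/IModularFeatures.h"
    #include "ILiveLinkClient.h"
    #include "LiveLinkTypes.h"

    static void LogLiveLinkSubjects()
    {
        IModularFeatures& Features = IModularFeatures::Get();
        if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
        {
            UE_LOG(LogTemp, Warning, TEXT("Live Link client is not available."));
            return;
        }

        ILiveLinkClient& Client =
            Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

        // Include disabled and virtual subjects so nothing is hidden while debugging.
        for (const FLiveLinkSubjectKey& Key : Client.GetSubjects(true, true))
        {
            UE_LOG(LogTemp, Log, TEXT("Live Link subject: %s"), *Key.SubjectName.Name.ToString());
        }
    }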
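
Step 7 can also be done from C++, which is handy when MetaHumans are spawned procedurally. A minimal sketch, assuming FaceMesh is the MetaHuman's face skeletal mesh component and that XR_Face_AnimBP lives at the content path below (both are assumptions; adjust them to your project):

    // Minimal sketch: assign XR_Face_AnimBP as the face mesh's anim class,
    // the same effect as setting Face Anim Class in the editor.
    #include "CoreMinimal.h"
    #include "Animation/AnimInstance.h"
    #include "Components/SkeletalMeshComponent.h"

    static void AssignFaceAnimBlueprint(USkeletalMeshComponent* FaceMesh)
    {
        // Assumed asset path; the "_C" suffix selects the blueprint's generated class.
        UClass* AnimClass = LoadClass<UAnimInstance>(
            nullptr, TEXT("/ViveMetaHuman/Animation/XR_Face_AnimBP.XR_Face_AnimBP_C"));

        if (FaceMesh && AnimClass)
        {
            FaceMesh->SetAnimInstanceClass(AnimClass);
        }
    }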
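
Finally, what BP_XR_FT_Manager does in step 9 boils down to applying a Live Link preset to the client. A minimal sketch using ULiveLinkPreset::ApplyToClient, with an assumed asset path for ViveOpenXRFacialLiveLinkPreset (adjust it to wherever the preset sits in your project):

    // Minimal sketch: load the facial tracking Live Link preset and apply it,
    // which recreates the preset's sources and subjects on the Live Link client.
    #include "CoreMinimal.h"
    #include "LiveLinkPreset.h"

    static void ApplyFacialLiveLinkPreset()
    {
        // Assumed asset path for illustration.
        ULiveLinkPreset* Preset = LoadObject<ULiveLinkPreset>(
            nullptr,
            TEXT("/ViveMetaHuman/Blueprint/ViveOpenXRFacialLiveLinkPreset.ViveOpenXRFacialLiveLinkPreset"));

        if (Preset)
        {
            Preset->ApplyToClient();
        }
    }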