
VRM


Introduction

The VRM model helps developers create content with realistic facial expressions.


Supported Platforms and Devices

Platform             Headset               Supported
PC - PC Streaming    Focus 3 / XR Elite    Yes
PC - Pure PC         Vive Cosmos           No
PC - Pure PC         Vive Pro series       Yes
AIO                  Focus 3 / XR Elite    Yes


The ViveVRM > Blueprint folder includes a Manager Actor, a LiveLinkPreset, and an Enumeration.

uefacialexpressionmakerblueprint.png

The ViveVRM > Animation folder includes a Sequence, a Pose Asset, and an Animation Blueprint.

uefacialexpressionmakeranimation.png


Enable Plugins

  1. In Edit > Plugins, search for OpenXR and ViveOpenXR and make sure they are enabled.
  2. In Edit > Plugins > Built-in > Virtual Reality, enable OpenXREyeTracker. Note that the "SteamVR" and "OculusVR" plugins must be disabled for OpenXR to work. Restart the engine for the changes to take effect.
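
If eye data does not come through after enabling these plugins, a quick runtime check can confirm the setup. The following is a minimal C++ sketch rather than part of the official sample: it assumes the EyeTracker module is listed in your Build.cs dependencies, and the helper name LogEyeTrackerStatus is only illustrative.

    // Minimal sketch: confirm the OpenXR eye tracker is connected and providing data.
    // Assumes "EyeTracker" is added to PublicDependencyModuleNames in Build.cs.
    #include "CoreMinimal.h"
    #include "EyeTrackerFunctionLibrary.h"
    #include "EyeTrackerTypes.h"

    static void LogEyeTrackerStatus() // illustrative helper, not an engine API
    {
        if (!UEyeTrackerFunctionLibrary::IsEyeTrackerConnected())
        {
            UE_LOG(LogTemp, Warning, TEXT("No eye tracker detected. Check that OpenXREyeTracker is enabled and SteamVR/OculusVR are disabled."));
            return;
        }

        FEyeTrackerGazeData GazeData;
        if (UEyeTrackerFunctionLibrary::GetGazeData(GazeData))
        {
            UE_LOG(LogTemp, Log, TEXT("Gaze direction: %s (confidence %.2f)"),
                *GazeData.GazeDirection.ToString(), GazeData.ConfidenceValue);
        }
    }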


Integrate VIVE OpenXR Facial Tracking with VRM Model

  1. Make sure the ViveOpenXR plugin is installed and the OpenXREyeTracker plugin is enabled.
  2. In Edit > Project Settings > Plugins > Vive OpenXR, enable Facial Tracking under Facial Tracking to enable the OpenXR Facial Tracking extension.
  3. Download the VRM4U plugin and put it in your project's Plugins folder.

  4. Enable the plugin in Edit > Plugins > PROJECT > VRM4U and restart the Unreal project.

    pluginvrm4urestart.png

  5. Import your VRM model by dragging in the *.vrm file. You can create your own avatar with VRoid Studio.

    ueimportvrmfileoption

  6. Under ViveVRM > Animation, double-click VRM_XR_FacialTracking_mapping_pose, VRM_XR_FacialTracking_mapping_anim, and the animation blueprint VRM_XR_Face_AnimBP. If a pop-up window says Could not find skeleton, click Yes to link your VRM face skeleton to the animation blueprint.

    vrmanimbpyes1.png

    For example, if the VRM file is ModelFacialVRM1.vrm, you need to select the face skeleton SKEL_ModelFacialVRM1.

    uevrmskeleton1.png

    Open the Animation Blueprint's AnimGraph and make sure the Live Link Pose nodes are set to the Eye and Lip subjects.

    ueanimbpeyelipgirl

  7. Double-click VRM_XR_FacialTracking_mapping_anim. If a pop-up window says Could not find skeleton, click Yes to link your VRM face skeleton to the animation blueprint.

    vrmmapanimyes.png

    For example, if the VRM file is ModelFacialVRM1.vrm, you need to select SKEL_ModelFacialVRM1.

    uevrmskeleton1.png

    Drag your VRM_XR_Face_AnimBP into the map. Make sure the Anim Class is VRM_XR_Face_AnimBP.

    animbpinmap

    Next, drag the BP_XR_FT_Manager under ViveVRM > Blueprint into the map. Make sure its Live Link preset is ViveOpenXRFacialLiveLinkPreset and Save All.
    BP_XR_FT_Manager passes the ViveOpenXRFacialLiveLinkPreset to the Live Link client (a C++ sketch of this hand-off is shown after these steps).

    uevivelivelinkpreset.png

  8. Package the app and run it on your VR device with all facial tracking devices connected to check the facial expression maker result.

    uefacialexpmakerdisplay.png
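
For reference, the hand-off described in step 7 (BP_XR_FT_Manager applying the ViveOpenXRFacialLiveLinkPreset to the Live Link client) can also be sketched in C++. This is a minimal sketch under stated assumptions, not the plugin's actual implementation: it assumes the Live Link plugin modules are listed in your Build.cs dependencies, and the asset path and helper name below are illustrative placeholders.

    // Minimal sketch: load a Live Link preset asset and apply it to the Live Link client,
    // which registers the preset's sources and subjects (here, Eye and Lip).
    // Assumes the Live Link plugin modules are added to your Build.cs dependencies.
    #include "CoreMinimal.h"
    #include "LiveLinkPreset.h"
    #include "UObject/SoftObjectPath.h"

    static void ApplyFacialLiveLinkPreset() // illustrative helper, not a plugin API
    {
        // Hypothetical asset path; point this at ViveOpenXRFacialLiveLinkPreset in your project.
        const FSoftObjectPath PresetPath(
            TEXT("/ViveVRM/Blueprint/ViveOpenXRFacialLiveLinkPreset.ViveOpenXRFacialLiveLinkPreset"));

        if (ULiveLinkPreset* Preset = Cast<ULiveLinkPreset>(PresetPath.TryLoad()))
        {
            // ApplyToClient registers the preset's sources and subjects with the Live Link client.
            const bool bApplied = Preset->ApplyToClient();
            UE_LOG(LogTemp, Log, TEXT("Facial Live Link preset applied: %s"),
                bApplied ? TEXT("true") : TEXT("false"));
        }
    }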


Troubleshooting

If you encounter a compile error or warning related to VRM4U Util in the Unreal editor, delete the files below.
The latest folder in Plugins > VRM4U > Content > Util > Actor.
The latest folder in Plugins > VRM4U > Content > Maps.
The M_CenterBlur.uasset in Plugins > VRM4U > Content > Util > Actor > Post > sub.