
Hand Tracking

Overview

In previous chapters, we discussed how to use controllers as input to interact with objects in XR. However, as a developer, I think it is important for us to make our content as immersive as possible. After all, wasn't that the first reason that drove us to the field of XR? To create content the player can interact with in person, instead of indirectly through an avatar, as they do on other platforms such as PCs or consoles.

In this chapter, I will show you how we can create more immersive experiences by using the Hand Tracking feature. Currently, the VIVE XR HandTracking extension provides the following features:

• Hand Tracking: Enable applications to locate the individual joints of hand tracking inputs and render hands.
• Hand Interaction: Enable applications to aim at virtual objects and determine if a player is selecting something based on the player’s finger pinch strength.

Development Environment Requirements

Currently, the Unity Editor doesn't provide a default Hand Tracking interface. Therefore, in this chapter, we'll be using the VIVE XR HandTracking extension. Before starting, check that your development environment meets the following requirements.

Device:

Development software:

  • Unity Editor

VIVE SDK:

  • VIVE OpenXR Plugin : 1.0.1 or later

(To check your VIVE SDK version, see How to Install VIVE OpenXR Plugin.)

See Your Hand In XR

In this chapter, I am going to teach you how to use VIVE OpenXR Hand Tracking, for I believe it is simpler, more efficient, and covers most user scenarios. However, if you're interested in the API design of the original plug-in, check the API_Reference.

VIVE OpenXR Hand Tracking defines 26 joints for each hand, as shown below.

image1_LbSYZOc.width-500.png

Each joint contains useful information, such as tracking status, position, and rotation.
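The order of those 26 joints follows the OpenXR XR_EXT_hand_tracking extension: palm, wrist, then metacarpal/proximal/(intermediate)/distal/tip for each finger (the thumb has no intermediate joint). As a reference, here is that mapping sketched as a C# enum; the enum and member names are our own shorthand for this tutorial, not SDK types:

```csharp
// Joint indices per the OpenXR XrHandJointEXT layout (26 joints per hand).
public enum HandJoint
{
    Palm = 0,
    Wrist = 1,
    ThumbMetacarpal = 2, ThumbProximal = 3, ThumbDistal = 4, ThumbTip = 5,
    IndexMetacarpal = 6, IndexProximal = 7, IndexIntermediate = 8, IndexDistal = 9, IndexTip = 10,
    MiddleMetacarpal = 11, MiddleProximal = 12, MiddleIntermediate = 13, MiddleDistal = 14, MiddleTip = 15,
    RingMetacarpal = 16, RingProximal = 17, RingIntermediate = 18, RingDistal = 19, RingTip = 20,
    LittleMetacarpal = 21, LittleProximal = 22, LittleIntermediate = 23, LittleDistal = 24, LittleTip = 25
}
```

These indices are what the jointNum variable used later in this chapter refers to.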

Step 1. Check your VIVE OpenXR Plugin package version
Go to Window > Package Manager; the VIVE OpenXR Plugin version should be 2.0.0 or newer.

VIVEOpenXRPlugin200.png

Step 2. Enable the Hand Tracking feature
Go to Project Settings > XR Plug-In Management > OpenXR and enable VIVE XR Hand Tracking.

XRPluginManagementHandTracking.png

Step 3. Create a Rig
3.1 In the editor, create a basic rig like this.

HandTrackingCreateRig.png
Remember to attach the TrackedPoseDriver onto the Head, as mentioned in the previous chapter.

3.2 Create two empty GameObjects named LeftHand and RightHand under Rig.

HandTrackingCreateObject.png

The hand joints will be placed in these two GameObjects.

Step 4. Create a Joint prefab
The joint prefab represents the pose of a joint.

First, create a script called Joint_Movement.cs.

In Joint_Movement.cs, we first add two namespaces.

//---Script
using VIVE.OpenXR;
using VIVE.OpenXR.Hand;
//---Script

These two namespaces allow us to use the VIVE OpenXR Hand Tracking.

The two variables, jointNum and isLeft, tell the script which joint of which hand it should be tracking.

//---Script
public int jointNum;
public bool isLeft;
//---Script

In Update(), we’ll retrieve the joint poses from XR_EXT_hand_tracking.Interop.GetJointLocations().

//---Script
            if (!XR_EXT_hand_tracking.Interop.GetJointLocations(isLeft, out XrHandJointLocationEXT[] handJointLocation)) { return; }
//---Script

Full script:

//---Script
// Copyright HTC Corporation All Rights Reserved.

using System;
using System.Collections.Generic;
using UnityEngine;
using VIVE.OpenXR.Hand;

namespace VIVE.OpenXR.Samples.Hand
{
    public class Joint_Movement : MonoBehaviour
    {
        public int jointNum = 0;
        public bool isLeft = false;
        // Child objects (e.g. the joint's mesh) shown only while the pose is tracked.
        [SerializeField] List<GameObject> Childs = new List<GameObject>();

        private Vector3 jointPos = Vector3.zero;
        private Quaternion jointRot = Quaternion.identity;

        // OpenXR reports poses in a right-handed coordinate system; Unity is
        // left-handed, so the z components are negated during conversion.
        public static void GetVectorFromOpenXR(XrVector3f xrVec3, out Vector3 vec)
        {
            vec.x = xrVec3.x;
            vec.y = xrVec3.y;
            vec.z = -xrVec3.z;
        }
        public static void GetQuaternionFromOpenXR(XrQuaternionf xrQuat, out Quaternion qua)
        {
            qua.x = xrQuat.x;
            qua.y = xrQuat.y;
            qua.z = -xrQuat.z;
            qua.w = -xrQuat.w;
        }

        void Update()
        {
            // Fetch the latest joint poses for this hand; bail out if unavailable.
            if (!XR_EXT_hand_tracking.Interop.GetJointLocations(isLeft, out XrHandJointLocationEXT[] handJointLocation)) { return; }

            bool poseTracked = false;

            // Apply the rotation only if the runtime reports it as tracked.
            if (((UInt64)handJointLocation[jointNum].locationFlags & (UInt64)XrSpaceLocationFlags.XR_SPACE_LOCATION_ORIENTATION_TRACKED_BIT) != 0)
            {
                GetQuaternionFromOpenXR(handJointLocation[jointNum].pose.orientation, out jointRot);
                transform.rotation = jointRot;
                poseTracked = true;
            }
            // Likewise for the position.
            if (((UInt64)handJointLocation[jointNum].locationFlags & (UInt64)XrSpaceLocationFlags.XR_SPACE_LOCATION_POSITION_TRACKED_BIT) != 0)
            {
                GetVectorFromOpenXR(handJointLocation[jointNum].pose.position, out jointPos);
                transform.localPosition = jointPos;
                poseTracked = true;
            }

            // Hide the joint's visuals whenever the pose isn't tracked.
            ActiveChilds(poseTracked);
        }

        void ActiveChilds(bool _SetActive)
        {
            for (int i = 0; i < Childs.Count; i++)
            {
                Childs[i].SetActive(_SetActive);
            }
        }
    }
}
//---Script
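The two Get...FromOpenXR helpers in the script above handle the handedness difference: OpenXR uses a right-handed coordinate system while Unity uses a left-handed one, so the z component of positions (and the z and w components of rotations) are negated. A minimal standalone illustration of the position flip, using plain floats instead of Unity types (the class and method names here are for illustration only):

```csharp
public static class OpenXrConvert
{
    // OpenXR (right-handed): +z points toward the viewer.
    // Unity (left-handed): +z points away from the viewer.
    // Converting a position between the two negates the z component.
    public static (float x, float y, float z) PositionToUnity(float x, float y, float z)
        => (x, y, -z);
}
```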

Second, create a simple prefab (a ball or cube) named Joint, then attach the Joint_Movement script to it.

HandTrackingJoint.png

Step 5. Generate the Hand
We’ll use the Joint prefab created in step 4 to generate the hand model. Create a second script and name it Show_Hand.cs.

//---Script
using UnityEngine;

namespace VIVE.OpenXR.Samples.Hand
{
    public class Show_Hand : MonoBehaviour
    {
        public GameObject jointPrefab;
        public Transform leftHand;
        public Transform rightHand;

        void Start()
        {
            // Spawn one Joint prefab per joint (26 per hand) under each hand root.
            for (int i = 0; i < 26; i++)
            {
                Joint_Movement left = Instantiate(jointPrefab, leftHand).GetComponent<Joint_Movement>();
                left.isLeft = true;
                left.jointNum = i;

                Joint_Movement right = Instantiate(jointPrefab, rightHand).GetComponent<Joint_Movement>();
                right.isLeft = false;
                right.jointNum = i;
            }
        }
    }
}
//---Script

Then, attach this script to any GameObject in the scene and remember to assign the variables in the Inspector window.

ShowHand.png

This script simply spawns the 26 joints for each hand.

Now, we are ready to go. Let’s build and run this app and see your hands in the XR world.

For a more detailed tutorial, watch this video: Hand Tracking Tutorial Video

Interact with Objects Remotely

While, as I said previously, we all want the player to have a more immersive experience in the XR world, we can't ignore the fact that controllers offer useful features, such as interacting with objects that are not directly in front of the player by using raycasting.

Here, I'd like to introduce you to a simple but helpful feature in VIVE XR Hand Interaction. Hand Interaction lets the player interact with objects remotely with the aid of raycasting.

Essentially, Hand Interaction provides a position and a rotation (the pointer pose), plus a value that indicates whether the player is pinching (selecting an object) or not.

The direction of the ray is the "forward" of pointerPose.rotation.

(Hand Interaction provides useful data, such as the pose and selectValue)

As you can imagine, with the data that Hand Interaction provides, we’re able to use Raycasting to select or interact with remote objects in XR.

This can be useful when the player is using hand tracking and needs to interact with other objects at the same time, such as scrolling or selecting an item on a panel that is not directly in front of the player.
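As a sketch of how the pointer pose and pinch value could drive remote selection, the component below casts a ray from the pointer pose and logs whatever the player pinches at. The action bindings, the 0.9f pinch threshold, and the component name are all assumptions for illustration; in your own project, bind the actions to the profile's pointerPose and selectValue:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical example: casts a ray from the Hand Interaction pointer pose
// and "selects" whatever the ray hits while the player is pinching.
public class PinchRaySelector : MonoBehaviour
{
    public InputActionProperty pointerPosition; // bound to pointerPose position
    public InputActionProperty pointerRotation; // bound to pointerPose rotation
    public InputActionProperty selectValue;     // bound to selectValue (0..1)

    const float PinchThreshold = 0.9f; // assumed threshold; tune to taste

    void Update()
    {
        Vector3 origin = pointerPosition.action.ReadValue<Vector3>();
        Quaternion rot = pointerRotation.action.ReadValue<Quaternion>();

        // The ray direction is the "forward" of pointerPose.rotation.
        if (Physics.Raycast(origin, rot * Vector3.forward, out RaycastHit hit, 10f))
        {
            if (selectValue.action.ReadValue<float>() > PinchThreshold)
            {
                Debug.Log("Selected: " + hit.collider.name);
            }
        }
    }
}
```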

Using the VIVE Hand Interaction Profile

In this section, I'd like to show you how to use the VIVE XR Hand Interaction profile. This profile is a simple but useful tool that provides data when the player forms their fingers into a pinch-like gesture, as shown below.


This profile can be used in various scenarios; I've listed some of them at the end of this page. The following steps show how to use it.

Step 1. Add the VIVE Hand Interaction profile
To use the Hand Interaction feature, simply add the VIVE Hand Interaction profile by going to Edit > Project Settings > XR Plug-In Management > OpenXR.

XRPluginManagementHandInteraction.png

Step 2. Use the feature through an action map
You can use the VIVE XR Hand Interaction profile in any action map.

Here is some of the data provided by VIVE XR Hand Interaction:

  1. selectValue indicates how strongly the player is pinching.
    HandTrackingSetValue.png

  2. pointerPose returns the pose of the pinch.
    HandTrackingPointPose.png

If you are not familiar with action maps, check the basic input chapter. With the Hand Interaction feature, you can interact with objects remotely in the XR world.
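Because selectValue is a continuous 0-to-1 strength rather than a button, it helps to debounce it with two thresholds so a selection doesn't flicker on and off near a single cutoff. A minimal sketch of that idea (the 0.8/0.6 thresholds and the class name are assumptions, not SDK values):

```csharp
// Turns the continuous selectValue into a stable pinching state using
// hysteresis: press above 0.8, release only once it drops below 0.6.
public class PinchState
{
    const float PressThreshold = 0.8f;
    const float ReleaseThreshold = 0.6f;

    public bool IsPinching { get; private set; }

    // Call once per frame with the current selectValue reading.
    public bool Update(float selectValue)
    {
        if (!IsPinching && selectValue >= PressThreshold) IsPinching = true;
        else if (IsPinching && selectValue <= ReleaseThreshold) IsPinching = false;
        return IsPinching;
    }
}
```

Feed it the value read from your action map each frame and use the returned state wherever you would otherwise check a trigger button.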