Hand Tracking

Overview

In previous chapters, we discussed how to use the controller as input to interact with objects in XR. As developers, however, we should aim to create content that is as immersive as possible. After all, wasn't that what drew us to XR in the first place: letting the player interact with content in person, instead of indirectly through an avatar as in other media such as PCs or consoles?

In this chapter, I will show you how to create more immersive experiences using the Hand Tracking feature. Currently, the VIVE OpenXR HandTracking extension provides the following features:

  • Hand Tracking: Enables applications to locate the individual joints of tracked hands and render them.
  • Hand Interaction: Enables applications to aim at virtual objects and determine whether the player is selecting something, based on the player's finger pinch strength.

Development Environment Requirements

Currently, the Unity Editor doesn't provide a default Hand Tracking interface, so in this chapter we'll use the VIVE OpenXR HandTracking extension instead. Before starting, check that your development environment meets the following requirements.

Device:

VIVE Focus 3

Development software:

Unity Editor

VIVE SDK:

VIVE Wave XR Plugin - OpenXR : 1.0.1 or later

VIVE Wave XR Toolkit - OpenXR : 1.0.1 or later

(To check your VIVE SDK version, go to [Link: to package importing] )