Hand Tracking
⚠ NOTICE
This legacy plugin is no longer updated or maintained. Please develop mobile content with the OpenXR 2-in-1 Unity / Unreal package.
Overview
In previous chapters, we discussed how to use the controller as input to interact with objects in XR. As developers, though, I think it is important for us to create content that is as immersive as possible. After all, wasn't that the first reason that drew us to XR? To create content the player can interact with in person, instead of indirectly through an avatar as they do in other media, such as PCs or consoles.
In this chapter, I will show you how we can create more immersive experiences by using the Hand Tracking feature. Currently, the VIVE OpenXR HandTracking extension provides the following features:
- Hand Tracking: Lets applications locate the individual joints of tracked hands and render them (see the sketch after this list).
- Hand Interaction: Lets applications aim at virtual objects and determine whether the player is selecting something based on the strength of their finger pinch.
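To make these two features concrete, below is a minimal sketch of reading hand joints and deriving a pinch-based select in Unity. It is written against Unity's XR Hands package (com.unity.xr.hands), which surfaces OpenXR hand-tracking data through XRHandSubsystem; the VIVE plugin's own C# types may differ. The thumb-to-index distance heuristic and the 2 cm threshold are assumptions standing in for the pinch strength that the Hand Interaction extension reports natively.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands; // com.unity.xr.hands (assumed to be installed)

public class HandPinchExample : MonoBehaviour
{
    // Distance (meters) between thumb tip and index tip below which we
    // treat the hand as "pinching". This value is an assumption; tune per app.
    const float k_PinchThreshold = 0.02f;

    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily grab the running hand subsystem (fed by the OpenXR runtime).
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        // Hand Tracking: locate individual joints of the tracked hand.
        var thumbTip = hand.GetJoint(XRHandJointID.ThumbTip);
        var indexTip = hand.GetJoint(XRHandJointID.IndexTip);
        if (!thumbTip.TryGetPose(out Pose thumbPose) ||
            !indexTip.TryGetPose(out Pose indexPose))
            return;

        // Hand Interaction (approximated): derive a pinch "select" from the
        // thumb-to-index distance. The real extension reports pinch strength
        // directly; this heuristic merely stands in for that signal.
        float distance = Vector3.Distance(thumbPose.position, indexPose.position);
        if (distance < k_PinchThreshold)
            Debug.Log($"Pinch select at {indexPose.position}");
    }
}
```

Note that joint poses are reported in the XR origin's tracking space, so transform them into world space (for example, via the XR Origin's transform) before comparing them against scene objects.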
Development Environment Requirements
Currently, the Unity Editor doesn't provide a default Hand Tracking interface. Therefore, in this chapter, we'll use the VIVE OpenXR HandTracking extension. Before starting, check that your development environment meets the following requirements.
Device:
- VIVE Focus 3
Development software:
- Unity Editor
VIVE SDK:
- VIVE Wave XR Plugin - OpenXR: 1.0.1 or later
- VIVE Wave XR Toolkit - OpenXR: 1.0.1 or later
(To check your VIVE SDK version, see [How to Install VIVE Wave OpenXR Plugin].)