
Intro to VIVE SRWorks SDK


VIVE SRWorks SDK (Early Access)

With the launch of VIVE Pro, developers now have access to the stereo front-facing cameras to create new experiences that mix the see-through stereo camera view with their virtual worlds. This enables developers to perform 3D perception and depth sensing with the stereo RGB sensors, opening the door to more creative, interactive experiences.

In addition to the updated OpenVR camera APIs, which now handle more than the mono camera of the original VIVE, the VIVE Software team is providing developers with early access to the VIVE SRWorks SDK.

With this SDK you can access more than just the raw camera images:

  • Depth
  • Spatial mapping (static and dynamic meshes)
  • Placing virtual objects in the foreground or background
  • Live interactions with virtual objects and simple hand interactions
  • AI Vision module for semantic segmentation

These features are provided by four service modules (Depth, See-through, 3D Reconstruction, and AI Vision), allowing developers to focus on their content.

The SDK includes support for native development with plugins for Unity and Unreal.

Also included is a sample chaperone: a human-detection chaperone that runs in the background as an overlay.

The following videos illustrate these features.

Interaction and Occlusion, ex. 1

Interaction and Occlusion, ex. 2

Spatial Scan

Visual Effects

A portal between the real and virtual world

The project code for the portal example is included in the VIVE SRWorks SDK Unity packages.

Below are examples from two of our early access developers.

Example: 3D semantic segmentation


Example: a human-detection chaperone running in the background as an overlay