VIVE Wave
SDK
- What is VIVE Wave?
- Where can I download the Wave SDK?
- How do I access the Wave Developer Community forum?
- How do I fix compile or build errors after importing a custom package?
- How do I put controllers into a scene?
- How do I switch heads with different tracking pose settings in the same scene?
- How do I switch between 6DoF and 3DoF in the same scene?
- Why can't I receive events or click on the system overlay?
- How do I get controller objects in runtime?
- How do I show or hide the controller pointer?
- How do I change the controller model object from Root to Emitter?
- How do I know if a controller sends an event?
- How do I set the controller position in a rotation-only environment?
- How do I display only one controller in a scene?
- How do I find out the distance of the head from the floor?
Distributing your app
VIVE SRWorks
VIVE Eye and Facial Tracking
- Where can I download the VIVE Eye and Facial Tracking SDK and runtime?
- How do I calibrate eye tracking?
- What drivers are required for VIVE Eye and Facial Tracking?
- How do I update the VIVE Eye and Facial Tracking runtime?
- How do I update the VIVE Pro Eye firmware?
- When starting my app, why does Windows prompt for permission to make changes?
- The eye camera version shows "N/A". Is that a problem?
- If I encounter other eye tracking issues, what should I do?
- I encountered eye calibration initialization and OpenCL errors. What should I do?
- Can eye tracking work when using the VIVE Wireless Adapter?
- If the user has visual impairment, what happens to the calibration data?
- What is the trackable FOV, and how accurate is the tracking?
VIVE Hand Tracking
- Does the Hand Tracking SDK support VIVE Cosmos?
- Are there prebuilt samples available, and where can I find them?
- My Unity build throws this exception: "DllNotFoundException: aristo_interface". What should I do?
- I encountered this error: "Start Detection Error: Camera". What should I do?
- I encountered this error: "Start Detection Error: OpenCL". What should I do?
- Is there a way to improve hand detection accuracy?
VIVE 3DSP Audio
- Are there tutorials for the VIVE 3DSP Audio SDK?
- How do I know if there's a public release for the SDK?
- How do I quickly enable VIVE 3DSP Audio SDK to add enhanced support for 3D sound effects in Unity?
- Does 3DSP Audio SDK have any hardware dependencies?
- Can VIVE 3DSP Audio SDK be used in conjunction with other spatialization SDKs?
VIVEPORT
VIVE Business Streaming
Facial tracking
Hand tracking
VIVE Wrist Tracker
- How do I use VIVE Wrist Tracker with VIVE Focus 3 for PC VR apps?
- How do I set VIVE Wrist Tracker options?
- How do I use VIVE Wrist Tracker as VIVE Tracker?
- How do I use VIVE Wrist Tracker for hand tracking?
- How do I use VIVE Wrist Tracker as VIVE Tracker for Native or Unity?
- How do I use VIVE Wrist Tracker as VIVE Tracker for Unreal Engine?
How do I switch between 6DoF and 3DoF in the same scene?
Headset and controller
In 3DoF mode, the tracked position is always Vector3(0, 0, 0): if you switch to 3DoF, the headset and controllers stay fixed at the origin.
While in VR, if no other transform is applied to the headset and controller GameObjects, the controller model will appear to come out of your head, and your head will sit at floor level. A fixed transform is therefore still needed to keep the head and controllers at a reasonable height above the floor, as sketched below.
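As an illustration only, here is a minimal sketch of one way to do this in Unity, assuming the headset and controllers are parented under a rig Transform. The FixedHeightRig name and the 1.5 m height are made up for this example, not values from the SDK:

using UnityEngine;

public class FixedHeightRig : MonoBehaviour
{
    public Transform rig;                // assumed parent of the head and controller GameObjects
    public float standingHeight = 1.5f;  // illustrative eye height above the floor

    // Raise the rig when 3DoF fixes tracked positions at the origin,
    // and drop the offset again when 6DoF tracking supplies real positions.
    public void ApplyOffset(bool is3DoF)
    {
        rig.localPosition = is3DoF ? new Vector3(0f, standingHeight, 0f) : Vector3.zero;
    }
}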
Pose Tracker
controllerGameObject.GetComponent<WaveVR_PoseTracker>().trackPosition = false;
Set trackPosition to true to switch to 6DoF mode.
Render
For 3DoF, set WaveVR's render.origin to WVR_PoseOriginModel.WVR_PoseOriginModel_OriginOnHead_3DoF. For details, see the SDK API’s description.
var render = WaveVR_Render.Instance;
render.origin = WVR_PoseOriginModel.WVR_PoseOriginModel_OriginOnHead_3DoF;
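Putting the Pose Tracker and Render steps together, a minimal sketch of a runtime switch might look like this. The DoFSwitcher class is hypothetical, and WVR_PoseOriginModel_OriginOnHead is assumed here to be the 6DoF counterpart of the 3DoF origin above; verify the enum values and the wvr namespace against your SDK version:

using UnityEngine;
using wvr;

public class DoFSwitcher : MonoBehaviour
{
    public GameObject head;           // headset GameObject with a WaveVR_PoseTracker
    public GameObject[] controllers;  // controller GameObjects with WaveVR_PoseTrackers

    public void SetSixDoF(bool sixDoF)
    {
        // Enable or disable position tracking on every tracked object.
        head.GetComponent<WaveVR_PoseTracker>().trackPosition = sixDoF;
        foreach (var controller in controllers)
            controller.GetComponent<WaveVR_PoseTracker>().trackPosition = sixDoF;

        // Match the render origin to the tracking mode.
        WaveVR_Render.Instance.origin = sixDoF
            ? WVR_PoseOriginModel.WVR_PoseOriginModel_OriginOnHead
            : WVR_PoseOriginModel.WVR_PoseOriginModel_OriginOnHead_3DoF;
    }
}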
Eye to head transform
The EyeToHeadTransform matrix converts coordinates from eye space to head space. For example, if your left eye sees an apple straight ahead at position (0, 0, 1) in eye space, the actual location of the apple in the world is:
HeadPose * EyeToHeadTransform * (0,0,1)
Assuming EyeToHeadTransform is:
| 1 0 0 -0.03 |
| 0 1 0     0 |
| 0 0 1     0 |
| 0 0 0     1 |
and the headset's transform is:
| 1 0 0   0 |
| 0 1 0 1.8 |
| 0 0 1   0 |
| 0 0 0   1 |
then the apple is at:
- (-0.03, 0, 1) in the head space
- (-0.03, 1.8, 1) in the world space
The apple's position in eye space can also be recovered from its world-space position by applying the inverse transforms:
EyeToHeadTransform^-1 * HeadPose^-1 * (-0.03, 1.8, 1)
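To sanity-check the arithmetic, the example numbers can be reproduced with Unity's built-in Matrix4x4 type. This standalone sketch is illustrative and independent of the Wave SDK:

using UnityEngine;

public class EyeToWorldExample : MonoBehaviour
{
    void Start()
    {
        // Example values from the text: the left eye sits 0.03 m to the
        // left of the head origin, and the head is 1.8 m above the floor.
        Matrix4x4 eyeToHead = Matrix4x4.Translate(new Vector3(-0.03f, 0f, 0f));
        Matrix4x4 headPose  = Matrix4x4.Translate(new Vector3(0f, 1.8f, 0f));

        Vector3 appleInEye = new Vector3(0f, 0f, 1f);

        // HeadPose * EyeToHeadTransform * point
        Vector3 appleInHead  = eyeToHead.MultiplyPoint3x4(appleInEye);  // (-0.03, 0.0, 1.0)
        Vector3 appleInWorld = headPose.MultiplyPoint3x4(appleInHead);  // (-0.03, 1.8, 1.0)

        // EyeToHeadTransform^-1 * HeadPose^-1 * worldPoint recovers the eye-space point.
        Vector3 backInEye = eyeToHead.inverse.MultiplyPoint3x4(
            headPose.inverse.MultiplyPoint3x4(appleInWorld));           // (0.0, 0.0, 1.0)

        Debug.Log($"head: {appleInHead}, world: {appleInWorld}, eye: {backInEye}");
    }
}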
The EyeToHeadTransform may have different values in 6DoF and 3DoF.
Left eye in 6DoF:
| 1 0 0 -0.03 |
| 0 1 0  0.05 |
| 0 0 1  0.05 |
| 0 0 0     1 |
Left eye in 3DoF:
| 1 0 0 -0.03 |
| 0 1 0     0 |
| 0 0 1 -0.15 |
| 0 0 0     1 |
The 6DoF transform values vary according to your headset. For 3DoF, the transform values are based on your head's rotation center, where z is usually -0.15 for a better user experience.