Adds full-body presence to any VR application, using Inverse Kinematics and Animation Blending
Works with any HMD, including the Oculus Rift, and with up to five STEM trackers or the Razer Hydra
Fully integrated into Unity and Unreal Engine 4, with code-free setup for any humanoid character
SixenseVR supports any number of trackers, which can be placed on any part of the body. Head position tracking can come from a STEM Pack tracker, the HMD itself, or a fusion of the two for a calibration-free experience. Even with HMD tracking alone, SixenseVR matches the avatar's pose to the position and orientation of the head, providing body awareness with realistic-looking leaning and crouching.
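As an illustration of the fusion idea above, here is a minimal sketch of a confidence-weighted blend between the HMD's own head position and a STEM Pack tracker, falling back to HMD-only tracking when no pack is present. The function names, the confidence parameter, and the simple linear blend are assumptions for illustration, not the SixenseVR API.

```python
# Hypothetical sketch of head-position fusion; not the actual SixenseVR API.
# Positions are (x, y, z) tuples in meters.

def lerp3(a, b, t):
    """Linearly interpolate between two 3D points."""
    return tuple(ax + (bx - ax) * t for ax, bx in zip(a, b))

def fuse_head_position(hmd_pos, stem_pos, stem_confidence):
    """Blend HMD and STEM Pack head positions.

    stem_confidence in [0, 1]: 0 trusts the HMD alone,
    1 trusts the STEM Pack fully.
    """
    if stem_pos is None:          # HMD-only tracking still works
        return hmd_pos
    return lerp3(hmd_pos, stem_pos, stem_confidence)

# With no STEM Pack attached, the HMD position passes through unchanged:
print(fuse_head_position((0.0, 1.7, 0.0), None, 0.0))
```

A real implementation would also fuse orientation (e.g. with quaternion slerp) and derive the confidence from tracker signal quality, but the blending structure is the same.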
Poses generated by SixenseVR take existing game animations as a starting point, then use up to five tracking points on the body with our custom Inverse Kinematics solver to place each joint in a plausible, natural-feeling position that matches the user's body as closely as possible. Parts of the body without sensors fall back smoothly to game animations, and individual limbs can be blended in and out of our IK for locomotion and other scripted behavior.
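The per-limb blending described above can be sketched as a weighted interpolation between the animated pose and the IK-solved pose, with one weight per limb that can ramp to zero when a limb has no sensor or is handed back to a scripted animation. The names and the linear ramp here are illustrative assumptions, not the SixenseVR API.

```python
# Hypothetical sketch of per-limb animation/IK blending; illustrative only,
# not the SixenseVR API. Each joint position is an (x, y, z) tuple.

def blend_joint(anim_joint, ik_joint, ik_weight):
    """Blend one joint between game animation (weight 0) and IK (weight 1)."""
    return tuple(a + (i - a) * ik_weight for a, i in zip(anim_joint, ik_joint))

def blend_limb(anim_joints, ik_joints, ik_weight):
    """Blend every joint in a limb chain by the same weight, so a whole limb
    can be faded out of IK smoothly for locomotion or scripted behavior."""
    return [blend_joint(a, i, ik_weight) for a, i in zip(anim_joints, ik_joints)]

# Shoulder and elbow of one arm, from the animation and from the IK solver:
arm_anim = [(0.0, 1.4, 0.0), (0.2, 1.2, 0.1)]
arm_ik   = [(0.0, 1.4, 0.0), (0.3, 1.0, 0.2)]
print(blend_limb(arm_anim, arm_ik, 0.5))   # halfway between the two poses
```

In practice the weight would be animated over a few frames when a limb enters or leaves IK, which is what makes the hand-off look seamless rather than snapping.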
At launch, SixenseVR will support the STEM System and the Razer Hydra motion controllers. For future versions we will work together with other hardware manufacturers to incorporate new motion tracking technology, providing a common API for full-body tracking in virtual reality.
Our Unity integration wraps this up in a designer-friendly visual tool that accurately maps the bones of any Mecanim-rigged character to our STEM Controllers and STEM Packs, with no coding required. Setup is as simple as positioning virtual 3D models of STEM Controllers in the character's hands, and a virtual 3D model of an HMD on the character's face, in the Unity editor. This makes it easy to guarantee that the virtual head and hands always occupy the exact same positions as their real-world counterparts.
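The idea behind the editor placement can be sketched as capturing a fixed offset between the placed virtual controller model and the hand bone, then reapplying that offset to live tracker data at runtime. This sketch uses positions only and hypothetical names; a real implementation would also carry orientation, and this is not the actual SixenseVR or Unity API.

```python
# Hypothetical sketch of bone-to-tracker mapping via editor placement;
# positions only, illustrative names. Not the SixenseVR/Unity API.

def sub3(a, b):
    return tuple(x - y for x, y in zip(a, b))

def add3(a, b):
    return tuple(x + y for x, y in zip(a, b))

def capture_offset(controller_model_pos, hand_bone_pos):
    """In the editor: record where the hand bone sits relative to the
    virtual controller model placed in the character's hand."""
    return sub3(hand_bone_pos, controller_model_pos)

def hand_bone_at_runtime(tracker_pos, offset):
    """At runtime: apply the captured offset to the live tracker position,
    so the virtual hand lands exactly where the real controller is."""
    return add3(tracker_pos, offset)

# Editor: controller model placed in the hand, next to the hand bone.
offset = capture_offset((0.30, 1.00, 0.20), (0.28, 1.02, 0.20))
# Runtime: real tracker has moved; the bone follows with the same offset.
print(hand_bone_at_runtime((0.50, 1.10, 0.40), offset))
```

Because the offset is captured from the artist's placement rather than typed in by hand, any humanoid character can be set up without code, which is the point of the visual tool.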
Other engines such as Unreal Engine 4 and the Source SDK will also be supported.