SixenseVR is a cross-platform API that uses position and orientation data from motion sensors attached to the head and hands to provide a full-body skeletal pose for the user’s avatar. Using data from the sensor attached to the head, we can position each of the user’s hands accurately within their field of vision, ensuring near-perfect hand-eye coordination.
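As a rough illustration of the idea, the sketch below re-expresses a hand sensor's tracker-space position in the head's local frame, so the hand can be placed correctly relative to the user's view. The types and function names are hypothetical, not part of the actual SixenseVR C API.

```c
/* Minimal 3D vector and quaternion types (hypothetical; not SixenseVR's API). */
typedef struct { float x, y, z; } vec3;
typedef struct { float w, x, y, z; } quat;

/* Rotate vector v by the conjugate (inverse) of unit quaternion q,
 * i.e. express a world-space offset in the sensor's local frame. */
static vec3 rotate_by_conjugate(quat q, vec3 v) {
    quat c = { q.w, -q.x, -q.y, -q.z };
    /* t = 2 * cross(c.xyz, v) */
    vec3 t = { 2.0f * (c.y * v.z - c.z * v.y),
               2.0f * (c.z * v.x - c.x * v.z),
               2.0f * (c.x * v.y - c.y * v.x) };
    /* v' = v + c.w * t + cross(c.xyz, t) */
    vec3 r = { v.x + c.w * t.x + (c.y * t.z - c.z * t.y),
               v.y + c.w * t.y + (c.z * t.x - c.x * t.z),
               v.z + c.w * t.z + (c.x * t.y - c.y * t.x) };
    return r;
}

/* Given the head sensor's pose and a hand sensor's position, both in
 * tracker space, return the hand position in the head's local frame. */
vec3 hand_in_head_frame(vec3 head_pos, quat head_rot, vec3 hand_pos) {
    vec3 offset = { hand_pos.x - head_pos.x,
                    hand_pos.y - head_pos.y,
                    hand_pos.z - head_pos.z };
    return rotate_by_conjugate(head_rot, offset);
}
```

Because the hand is expressed relative to the tracked head rather than in raw sensor coordinates, the avatar's hands stay aligned with the real hands wherever the user looks.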
Full-Body Presence

Adds full-body presence to any VR application, using Inverse Kinematics and Animation Blending

Cross-platform C API
Compatible with any HMD

Works with any HMD including the Oculus Rift, and up to five STEM trackers or Razer Hydra

Integrated into Unity and Unreal Engine 4

Fully integrated into Unity and Unreal Engine 4, with code-free setup for any humanoid character


SixenseVR supports any number of trackers, which can be placed anywhere on the body. Head position tracking can come from a STEM Pack tracker, the HMD itself, or a fusion of the two for a calibration-free experience. Even with HMD tracking alone, SixenseVR will match the avatar’s pose to the position and orientation of the head, providing body awareness with realistic-looking leaning and crouching.
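One simple way to picture the fusion of the two head-position sources is a weighted blend, complementary-filter style: trust one estimate for absolute position and the other for fine motion. This is a minimal sketch under that assumption; the actual SixenseVR fusion method is not documented here, and the names are hypothetical.

```c
/* Hypothetical fusion of two head-position estimates (not the SixenseVR API).
 * alpha in [0,1] sets the mix: 0 = pure HMD estimate, 1 = pure STEM estimate. */
typedef struct { float x, y, z; } vec3f;

vec3f fuse_head_position(vec3f stem, vec3f hmd, float alpha) {
    vec3f out = {
        alpha * stem.x + (1.0f - alpha) * hmd.x,
        alpha * stem.y + (1.0f - alpha) * hmd.y,
        alpha * stem.z + (1.0f - alpha) * hmd.z,
    };
    return out;
}
```

In practice alpha could be chosen per axis or adapted over time, letting either source drop out entirely when only one tracker is available.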
 
Poses generated by SixenseVR take existing game animations as a starting point, then use up to five tracking points on the body with our custom Inverse Kinematics solver to place each joint in a plausible, natural-feeling position that matches the user’s body as closely as possible. Parts of the body without sensors fall back smoothly to game animations, and individual limbs can be blended in and out of our IK for locomotion and other scripted behavior.
 
At launch, SixenseVR will support the STEM System and the Razer Hydra motion controllers. In future versions we will work with other hardware manufacturers to incorporate new motion-tracking technology, providing a common API for full-body tracking in virtual reality.
 
Our Unity integration wraps this up in a designer-friendly visual tool that accurately maps the bones of any Mecanim-rigged character to our STEM Controllers and STEM Packs, with no coding required. Setup is as simple as positioning virtual 3D models of STEM Controllers in the character’s hands and a virtual 3D model of an HMD on the character’s face in the Unity editor. This makes it easy to guarantee that the virtual head and hands always occupy the exact same positions as their real-world counterparts.
 
Other engines such as Unreal Engine 4 and the Source SDK will also be supported.
  • Motion Controllers Make Great Virtual Reality Lightsabers

    "It's one of the most obviously fun, relatable things to do with motion controllers, even for people who are otherwise skeptical of VR."    The Verge

  • If You Want To Wield A Lightsaber, Sixense’s STEM System Could Be The Closest You’ll Ever Get

    "Yes, they're awesome."    Inventor Spot

  • Oculus Rift and STEMs make the ultimate lightsaber game possible

    "...we still don’t have a decent lightsaber game to play. That is set to change, though, through the combination of Oculus Rift and Sixense’s STEM System."    Geek

  • How to Become a Virtual Jedi with STEMs

    "Sixense’s STEMs are looking like a must have Oculus Rift companion..."    HITC Tech

  • Incredible Lightsaber and Gun Interactions

    "I am incredibly excited about the near-future of VR after seeing these three new videos from Sixense..."    Road to VR

MEDIA
  • Awesome VR Lightsabers

    The lightsaber or sword is the true test for a one-to-one motion tracking system in VR…

  • Advanced Weapon Mechanics

    An example of common game interactions with a focus on full-simulation weapon mechanics.

  • Easy-to-build VR Interactions

    We showcase interactions easily reproducible through the SixenseVR SDK.

  • Introducing the STEM System

    A motion tracking system for the most intuitive interaction with video games, virtual reality, and more.

  • Loading Human with STEM System

    Preview of Untold Games’ Loading Human played with our STEM System. We were impressed with the early integration!