SixenseVR is a cross-platform API that uses position and orientation data from motion sensors attached to the head and hands to provide a full-body skeletal pose for the user’s avatar. Using data from the sensor attached to the head, we can position each of the user’s hands accurately within the field of vision, ensuring near-perfect hand-eye coordination.
Full-Body Presence


Adds full-body presence to any VR application, using Inverse Kinematics and Animation Blending

Cross-platform C API
Compatible with any HMD


Works with any HMD including the Oculus Rift, and up to five STEM trackers or Razer Hydra

Integrated into Unity and Unreal Engine 4


Fully integrated into Unity and Unreal Engine 4, with code-free setup for any humanoid character




SixenseVR supports any number of trackers, which can be placed on any part of the body. Head position tracking can come from a STEM Pack tracker, from the HMD itself, or from a fusion of the two for a calibration-free experience. Even with only HMD tracking, SixenseVR will match the pose of the avatar to the position and orientation of the head, providing body awareness with realistic-looking leaning and crouching.
 
Poses generated by SixenseVR take existing game animations as a starting point, then use up to five tracking points on the body with our custom Inverse Kinematics solver to position each joint in a plausible, natural-feeling position that matches the user’s body as closely as possible. Parts of the body that do not have sensors fall back smoothly to game animations, and individual limbs can be blended in and out of our IK for locomotion and other scripted behavior.
 
At launch, SixenseVR will support the STEM System and the Razer Hydra motion controllers. For future versions we will work together with other hardware manufacturers to incorporate new motion tracking technology, providing a common API for full-body tracking in virtual reality.
 
Our Unity integration wraps this up in a designer-friendly visual tool that accurately maps the bones of any Mecanim-rigged character to our STEM Controllers and STEM Packs, with no coding required. Setup is as simple as positioning virtual 3D models of STEM Controllers in the character’s hands, and a virtual 3D model of an HMD on the character’s face, in the Unity editor. This makes it easy to guarantee that the virtual head and hands are always in exactly the same positions as in the real world.
 
Other engines such as Unreal Engine 4 and the Source SDK will also be supported.
  • Hands-On with Sixense STEM

    "The inventors of Razer's Hydra controller technology have a new motion-tracking system..."    Tested

  • MakeVR will let you design a 3D object with your hands

    "You may soon no longer need to master complicated design software to dive into 3D printing."    VentureBeat

  • MakeVR: a virtual space to quickly produce 3D designs

    "When a design is complete, it can be sent directly to a 3D printer without any post-processing required."    GigaOm

  • MakeVR Wants to Make 3D Design Easy

    "The software’s newest feature is its most impressive. Sixense will introduce collaborative modeling..."    Popular Mechanics

  • Hands-on with Sixense’s MakeVR

    "Can't find the perfect gift for your grandmother's birthday?
    Just make it."    Engadget

MEDIA
  • Introducing the STEM System

    A motion tracking system for the most intuitive interaction with video games, virtual reality, and more.

  • Loading Human with STEM System

    Preview of Untold Games’s Loading Human played with our STEM System. We were impressed with the early integration!

  • Sixense STEM Hands-On

    Engadget tries out the Sixense STEM prototype with Sixense Creative Director Danny Woodall        

  • STEM Five-Tracker Demo

    Danny Woodall uses five wireless prototype trackers to control a humanoid avatar…                  

  • SixenseVR SDK

    Check out this preview of SixenseVR, streamlining your VR integration into Unity and Unreal Engine 4!