
Head-gaze in Unity

Gaze is a primary way for users to target the holograms your app creates in Mixed Reality.

Implementing head-gaze

Conceptually, you determine head-gaze by projecting a ray forward from the user’s headset to see what it hits. In Unity, the user’s head position and direction are exposed through the Camera, specifically UnityEngine.Camera.main.transform.forward and UnityEngine.Camera.main.transform.position.

Calling Physics.Raycast gives you a RaycastHit (through its out parameter) with information about the collision, including the 3D collision point and the GameObject the head-gaze ray hit.

Example: Implement head-gaze

void Update()
{
    RaycastHit hitInfo;
    if (Physics.Raycast(
            Camera.main.transform.position,
            Camera.main.transform.forward,
            out hitInfo,
            20.0f,
            Physics.DefaultRaycastLayers))
    {
        // The raycast succeeded and hit a hologram:
        // hitInfo.point is the position being gazed at
        // hitInfo.collider.gameObject is the hologram being gazed at
    }
}

Best practices

While the example above fires a single raycast from the update loop to find the target the user's head points at, we recommend using a single object to manage all head-gaze processes. Combining your head-gaze logic saves your app precious processing power and limits your raycasting to one per frame.
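
As a sketch of that pattern, a centralized component can raycast once in Update and cache the result for other scripts to read. The GazeManager class, its Instance singleton, and the 20-meter range below are illustrative assumptions, not part of this article's sample:

using UnityEngine;

// Illustrative sketch of a centralized gaze manager: it performs one raycast
// per frame and exposes the result so other scripts can query it instead of
// raycasting themselves.
public class GazeManager : MonoBehaviour
{
    public static GazeManager Instance { get; private set; }

    // Results of this frame's head-gaze raycast.
    public bool IsGazingAtObject { get; private set; }
    public RaycastHit HitInfo { get; private set; }

    [SerializeField] private float maxGazeDistance = 20.0f;

    void Awake()
    {
        Instance = this;
    }

    void Update()
    {
        RaycastHit hitInfo;
        IsGazingAtObject = Physics.Raycast(
            Camera.main.transform.position,
            Camera.main.transform.forward,
            out hitInfo,
            maxGazeDistance,
            Physics.DefaultRaycastLayers);
        HitInfo = hitInfo;
    }
}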

Visualizing head-gaze

Just like with a mouse pointer on a computer, you should implement a cursor that represents the user's head-gaze. Showing users what they're targeting gives them confidence in what they're about to interact with.
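
A minimal cursor sketch (the cursorVisual field is a hypothetical child object, and the GazeManager reference assumes the illustrative manager sketched above) places a small object at the gaze hit point and aligns it with the surface normal:

using UnityEngine;

// Illustrative sketch: shows a cursor object at the head-gaze hit point and
// aligns it with the surface normal. Relies on the GazeManager sketch above;
// you could also raycast here directly instead.
public class GazeCursor : MonoBehaviour
{
    [SerializeField] private GameObject cursorVisual;

    void Update()
    {
        if (GazeManager.Instance != null && GazeManager.Instance.IsGazingAtObject)
        {
            cursorVisual.SetActive(true);
            cursorVisual.transform.position = GazeManager.Instance.HitInfo.point;
            cursorVisual.transform.rotation =
                Quaternion.FromToRotation(Vector3.up, GazeManager.Instance.HitInfo.normal);
        }
        else
        {
            cursorVisual.SetActive(false);
        }
    }
}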

Head-gaze in the Mixed Reality Toolkit

You can access head-gaze from the Input Manager in MRTK.
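
For example, assuming MRTK v2's CoreServices API, the input system's gaze provider exposes the current gaze target and hit information; a minimal sketch to read them might look like this:

using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Sketch assuming MRTK v2: CoreServices.InputSystem.GazeProvider exposes the
// current head-gaze target and hit position.
public class MrtkGazeExample : MonoBehaviour
{
    void Update()
    {
        var gazeProvider = CoreServices.InputSystem?.GazeProvider;
        if (gazeProvider != null && gazeProvider.GazeTarget != null)
        {
            // GazeTarget is the GameObject currently being gazed at;
            // HitPosition is the point where the gaze ray hit it.
            Debug.Log($"Gazing at {gazeProvider.GazeTarget.name} at {gazeProvider.HitPosition}");
        }
    }
}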

Next Development Checkpoint

If you’re following the Unity development journey we’ve laid out, you’re in the midst of exploring the MRTK core building blocks. From here, you can continue to the next building block:

[!div class="nextstepaction"] Motion controllers

Or jump to Mixed Reality platform capabilities and APIs:

[!div class="nextstepaction"] Shared experiences

You can go back to the Unity development checkpoints at any time.
