HoloLens (1st gen) Input 211: Gesture

[!IMPORTANT] The Mixed Reality Academy tutorials were designed with HoloLens (1st gen), Unity 2017, and Mixed Reality Immersive Headsets in mind. As such, we feel it is important to leave these tutorials in place for developers who are still looking for guidance in developing for those devices. These tutorials will not be updated with the latest toolsets or interactions being used for HoloLens 2 and may not be compatible with newer versions of Unity. They will be maintained to continue working on the supported devices. A new series of tutorials has been posted for HoloLens 2.

Gestures turn user intention into action. With gestures, users can interact with holograms. In this course, we’ll learn how to track the user’s hands, respond to user input, and give feedback to the user based on hand state and location.

[!VIDEO https://www.youtube.com/embed/c9zlpfFeEtc]

In MR Basics 101, we used a simple air-tap gesture to interact with our holograms. Now, we’ll move beyond the air-tap gesture and explore new gesture concepts: hand-detected feedback, navigation, hand guidance, manipulation, and model expansion.

In this course, we’ll revisit the Unity project Model Explorer, which we built in MR Input 210. Our astronaut friend is back to assist us in our exploration of these new gesture concepts.

[!IMPORTANT] The videos embedded in each of the chapters below were recorded using an older version of Unity and the Mixed Reality Toolkit. While the step-by-step instructions are accurate and current, you may see scripts and visuals in the corresponding videos that are out-of-date. The videos remain included for posterity and because the concepts covered still apply.

Device support

| Course | HoloLens | Immersive headsets |
| --- | --- | --- |
| MR Input 211: Gesture | ✔️ | ✔️ |

Before you start

Prerequisites

Project files

[!NOTE] If you want to look through the source code before downloading, it’s available on GitHub.

Errata and Notes

Chapter 0 - Unity Setup

Instructions

  1. Start Unity.
  2. Select Open.
  3. Navigate to the Gesture folder you previously unarchived.
  4. Find and select the Starting/Model Explorer folder.
  5. Click the Select Folder button.
  6. In the Project panel, expand the Scenes folder.
  7. Double-click ModelExplorer scene to load it in Unity.

Building

  1. In Unity, select File > Build Settings.
  2. If Scenes/ModelExplorer is not listed in Scenes In Build, click Add Open Scenes to add the scene.
  3. If you’re specifically developing for HoloLens, set Target device to HoloLens. Otherwise, leave it on Any device.
  4. Ensure Build Type is set to D3D and SDK is set to Latest installed (which should be SDK 16299 or newer).
  5. Click Build.
  6. Create a new folder named “App”.
  7. Single-click the App folder.
  8. Press Select Folder and Unity will start building the project for Visual Studio.

When Unity is done, a File Explorer window will appear.

  1. Open the App folder.
  2. Open the ModelExplorer Visual Studio Solution.

If deploying to HoloLens:

  1. Using the top toolbar in Visual Studio, change the target from Debug to Release and from ARM to x86.
  2. Click on the drop-down arrow next to the Local Machine button, and select Remote Machine.
  3. Enter your HoloLens device IP address and set Authentication Mode to Universal (Unencrypted Protocol). Click Select. If you do not know your device IP address, look in Settings > Network & Internet > Advanced Options.
  4. In the top menu bar, click Debug > Start Without Debugging or press Ctrl + F5. If this is the first time deploying to your device, you will need to pair it with Visual Studio.
  5. When the app has deployed, dismiss the Fitbox with a select gesture.

If deploying to an immersive headset:

  1. Using the top toolbar in Visual Studio, change the target from Debug to Release and from ARM to x64.
  2. Make sure the deployment target is set to Local Machine.
  3. In the top menu bar, click Debug > Start Without Debugging or press Ctrl + F5.
  4. When the app has deployed, dismiss the Fitbox by pulling the trigger on a motion controller.

[!NOTE] You might notice some red errors in the Visual Studio Error List. It is safe to ignore them. Switch to the Output panel to view actual build progress. Errors in the Output panel will require a fix (most often they are caused by a mistake in a script).

Chapter 1 - Hand detected feedback

[!VIDEO https://www.youtube.com/embed/D1FcIyuFTZQ]

Objectives

[!NOTE] On HoloLens 2, the hands-detected event fires whenever hands are visible (not just when a finger is pointing up).

Instructions

The InteractionInputSource.cs script performs these steps:

  1. Subscribes to the InteractionSourceDetected and InteractionSourceLost events.
  2. Sets the HandDetected state.
  3. Unsubscribes from the InteractionSourceDetected and InteractionSourceLost events.
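These three steps can be sketched directly against Unity 2017’s WSA input events. The following is a minimal, illustrative outline only, not the toolkit’s actual InteractionInputSource.cs; the class and property names here are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input;

/// <summary>
/// Illustrative sketch: drive a hand-detected flag from the
/// InteractionSourceDetected/Lost events (Unity 2017 WSA input API).
/// </summary>
public class HandDetectedSketch : MonoBehaviour
{
    // True while at least one hand source is being tracked.
    public bool HandDetected { get; private set; }

    private int trackedHandsCount = 0;

    private void OnEnable()
    {
        // 1. Subscribe to the detected/lost events.
        InteractionManager.InteractionSourceDetected += OnSourceDetected;
        InteractionManager.InteractionSourceLost += OnSourceLost;
    }

    private void OnSourceDetected(InteractionSourceDetectedEventArgs args)
    {
        // 2. Set the HandDetected state when a hand source appears.
        if (args.state.source.kind == InteractionSourceKind.Hand)
        {
            trackedHandsCount++;
            HandDetected = trackedHandsCount > 0;
        }
    }

    private void OnSourceLost(InteractionSourceLostEventArgs args)
    {
        if (args.state.source.kind == InteractionSourceKind.Hand)
        {
            trackedHandsCount--;
            HandDetected = trackedHandsCount > 0;
        }
    }

    private void OnDisable()
    {
        // 3. Unsubscribe so no stale handlers remain.
        InteractionManager.InteractionSourceDetected -= OnSourceDetected;
        InteractionManager.InteractionSourceLost -= OnSourceLost;
    }
}
```

A counter is used rather than a plain bool so that losing one of two tracked hands does not clear the flag while the other hand is still visible.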

Next, we’ll upgrade our cursor from MR Input 210 into one that shows feedback depending on the user’s actions.

  1. In the Hierarchy panel, select the Cursor object and delete it.
  2. In the Project panel, search for CursorWithFeedback and drag it into the Hierarchy panel.
  3. Click on InputManager in the Hierarchy panel, then drag the CursorWithFeedback object from the Hierarchy into the InputManager’s SimpleSinglePointerSelector’s Cursor field, at the bottom of the Inspector.
  4. Click on the CursorWithFeedback in the Hierarchy.
  5. In the Inspector panel, expand Cursor State Data on the Object Cursor script.

The Cursor State Data on the Object Cursor script determines which cursor GameObject is displayed for each interaction state, such as observing, interacting, and when a hand is detected.
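As a rough illustration of the idea behind that state-to-visual mapping (field and class names below are hypothetical, not the toolkit’s actual Object Cursor members), it can be modeled as an array of serializable entries that the cursor script switches between:

```csharp
using System;
using UnityEngine;

/// <summary>
/// Hypothetical sketch of a cursor-state-to-visual mapping, in the
/// spirit of Object Cursor's Cursor State Data. Names are illustrative.
/// </summary>
[Serializable]
public struct CursorStateDatum
{
    public string Name;             // Readable label in the Inspector.
    public int CursorStateId;       // The interaction state this entry covers.
    public GameObject CursorObject; // The visual shown while in that state.
}

public class SimpleStateCursor : MonoBehaviour
{
    [SerializeField]
    private CursorStateDatum[] cursorStateData = null;

    // Activate only the visual that matches the current state.
    public void OnCursorStateChange(int newStateId)
    {
        foreach (CursorStateDatum datum in cursorStateData)
        {
            datum.CursorObject.SetActive(datum.CursorStateId == newStateId);
        }
    }
}
```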

Build and Deploy

Chapter 2 - Navigation

[!VIDEO https://www.youtube.com/embed/sm-kxtKksSo]

Objectives

Instructions

To use Navigation gestures in our app, we are going to edit GestureAction.cs to rotate objects when the Navigation gesture occurs. Additionally, we’ll add feedback to the cursor to display when Navigation is available.

  1. In the Hierarchy panel, expand CursorWithFeedback.
  2. In the Holograms folder, find the ScrollFeedback asset.
  3. Drag and drop the ScrollFeedback prefab onto the CursorWithFeedback GameObject in the Hierarchy.
  4. Click on CursorWithFeedback.
  5. In the Inspector panel, click the Add Component button.
  6. In the menu, type in the search box CursorFeedback. Select the search result.
  7. Drag and drop the ScrollFeedback object from the Hierarchy onto the Scroll Detected Game Object property in the Cursor Feedback component in the Inspector.
  8. In the Hierarchy panel, select the AstroMan object.
  9. In the Inspector panel, click the Add Component button.
  10. In the menu, type in the search box Gesture Action. Select the search result.

Next, open GestureAction.cs in Visual Studio. In coding exercise 2.c, edit the script to do the following:

  1. Rotate the AstroMan object whenever a Navigation gesture is performed.
  2. Calculate the rotationFactor to control the amount of rotation applied to the object.
  3. Rotate the object around the y-axis when the user moves their hand left or right.

Complete coding exercise 2.c in the script, or replace the code with the completed solution below:

using HoloToolkit.Unity.InputModule;
using UnityEngine;

/// <summary>
/// GestureAction performs custom actions based on
/// which gesture is being performed.
/// </summary>
public class GestureAction : MonoBehaviour, INavigationHandler, IManipulationHandler, ISpeechHandler
{
    [Tooltip("Rotation max speed controls amount of rotation.")]
    [SerializeField]
    private float RotationSensitivity = 10.0f;

    private bool isNavigationEnabled = true;
    public bool IsNavigationEnabled
    {
        get { return isNavigationEnabled; }
        set { isNavigationEnabled = value; }
    }

    private Vector3 manipulationOriginalPosition = Vector3.zero;

    void INavigationHandler.OnNavigationStarted(NavigationEventData eventData)
    {
        InputManager.Instance.PushModalInputHandler(gameObject);
    }

    void INavigationHandler.OnNavigationUpdated(NavigationEventData eventData)
    {
        if (isNavigationEnabled)
        {
            /* TODO: DEVELOPER CODING EXERCISE 2.c */

            // 2.c: Calculate a float rotationFactor based on eventData's NormalizedOffset.x multiplied by RotationSensitivity.
            // This will help control the amount of rotation.
            float rotationFactor = eventData.NormalizedOffset.x * RotationSensitivity;

            // 2.c: transform.Rotate around the Y axis using rotationFactor.
            transform.Rotate(new Vector3(0, -1 * rotationFactor, 0));
        }
    }

    void INavigationHandler.OnNavigationCompleted(NavigationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void INavigationHandler.OnNavigationCanceled(NavigationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void IManipulationHandler.OnManipulationStarted(ManipulationEventData eventData)
    {
        if (!isNavigationEnabled)
        {
            InputManager.Instance.PushModalInputHandler(gameObject);

            manipulationOriginalPosition = transform.position;
        }
    }

    void IManipulationHandler.OnManipulationUpdated(ManipulationEventData eventData)
    {
        if (!isNavigationEnabled)
        {
            /* TODO: DEVELOPER CODING EXERCISE 4.a */

            // 4.a: Make this transform's position be the manipulationOriginalPosition + eventData.CumulativeDelta
        }
    }

    void IManipulationHandler.OnManipulationCompleted(ManipulationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void IManipulationHandler.OnManipulationCanceled(ManipulationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void ISpeechHandler.OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        if (eventData.RecognizedText.Equals("Move Astronaut"))
        {
            isNavigationEnabled = false;
        }
        else if (eventData.RecognizedText.Equals("Rotate Astronaut"))
        {
            isNavigationEnabled = true;
        }
        else
        {
            return;
        }

        eventData.Use();
    }
}

You’ll notice that the other navigation events are already filled in with some info. We push the GameObject onto the Toolkit’s InputSystem’s modal stack, so the user doesn’t have to maintain focus on the Astronaut once rotation has begun. Correspondingly, we pop the GameObject off the stack once the gesture is completed.

Build and Deploy

  1. Rebuild the application in Unity, then build and deploy from Visual Studio to run it on the HoloLens.
  2. Gaze at the astronaut; two arrows should appear on either side of the cursor. This new visual indicates that the astronaut can be rotated.
  3. Place your hand in the ready position (index finger pointed towards the sky) so the HoloLens will start tracking your hand.
  4. To rotate the astronaut, lower your index finger to a pinch position, and then move your hand left or right to trigger the NavigationX gesture.

Chapter 3 - Hand Guidance

[!VIDEO https://www.youtube.com/embed/ULzlVw4e14I]

Objectives

Instructions

  1. In the Hierarchy panel, select the CursorWithFeedback object.
  2. In the Inspector panel, click the Add Component button.
  3. In the menu, type in the search box Hand Guidance. Select the search result.
  4. In the Project panel Holograms folder, find the HandGuidanceFeedback asset.
  5. Drag and drop the HandGuidanceFeedback asset onto the Hand Guidance Indicator property in the Inspector panel.

Build and Deploy

Chapter 4 - Manipulation

[!VIDEO https://www.youtube.com/embed/f3m8MvU60-I]

Objectives

Instructions

GestureManager.cs and AstronautManager.cs will allow us to do the following:

  1. Use the speech keyword “Move Astronaut” to enable Manipulation gestures and “Rotate Astronaut” to disable them.
  2. Switch to responding to the Manipulation Gesture Recognizer.
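Under the hood, switching between rotation and movement amounts to swapping which gesture recognizer is actively capturing input. The following is a hedged sketch against Unity’s raw GestureRecognizer API; the toolkit wraps this differently, and the class and method names here are illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input;

/// <summary>
/// Illustrative sketch: two GestureRecognizers, one for navigation and
/// one for manipulation, with only one capturing gestures at a time.
/// </summary>
public class RecognizerSwitcherSketch : MonoBehaviour
{
    private GestureRecognizer navigationRecognizer;
    private GestureRecognizer manipulationRecognizer;

    private void Start()
    {
        // Recognizer for horizontal navigation (rotation).
        navigationRecognizer = new GestureRecognizer();
        navigationRecognizer.SetRecognizableGestures(GestureSettings.NavigationX);

        // Recognizer for manipulation (translation).
        manipulationRecognizer = new GestureRecognizer();
        manipulationRecognizer.SetRecognizableGestures(GestureSettings.ManipulationTranslate);

        // Start in navigation mode, matching IsNavigationEnabled = true.
        navigationRecognizer.StartCapturingGestures();
    }

    // Called when "Move Astronaut" is recognized.
    public void EnableManipulation()
    {
        navigationRecognizer.StopCapturingGestures();
        manipulationRecognizer.StartCapturingGestures();
    }

    // Called when "Rotate Astronaut" is recognized.
    public void EnableNavigation()
    {
        manipulationRecognizer.StopCapturingGestures();
        navigationRecognizer.StartCapturingGestures();
    }
}
```

Only one recognizer captures at a time because NavigationX and ManipulationTranslate would otherwise compete for the same hand movement.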

Let’s get started.

  1. In the Hierarchy panel, create a new empty GameObject. Name it “AstronautManager”.
  2. In the Inspector panel, click the Add Component button.
  3. In the menu, type in the search box Astronaut Manager. Select the search result.
  4. In the Inspector panel, click the Add Component button.
  5. In the menu, type in the search box Speech Input Source. Select the search result.

We’ll now add the speech commands required to control the interaction state of the astronaut.

  1. Expand the Keywords section in the Inspector.
  2. Click the + on the right-hand side to add a new keyword.
  3. Type the Keyword as Move Astronaut. Feel free to add a Key Shortcut if desired.
  4. Click the + on the right-hand side to add a new keyword.
  5. Type the Keyword as Rotate Astronaut. Feel free to add a Key Shortcut if desired.
  6. The corresponding handler code can be found in GestureAction.cs, in the ISpeechHandler.OnSpeechKeywordRecognized handler.

How to set up the Speech Input Source for Chapter 4

Next, we’ll set up the manipulation feedback on the cursor.

  1. In the Project panel Holograms folder, find the PathingFeedback asset.
  2. Drag and drop the PathingFeedback prefab onto the CursorWithFeedback object in the Hierarchy.
  3. In the Hierarchy panel, click on CursorWithFeedback.
  4. Drag and drop the PathingFeedback object from the Hierarchy onto the Pathing Detected Game Object property in the Cursor Feedback component in the Inspector.

Now we need to add code to GestureAction.cs to enable the following:

  1. Add code to the IManipulationHandler.OnManipulationUpdated function, which will move the astronaut when a Manipulation gesture is detected.
  2. Calculate the movement vector to determine where the astronaut should be moved to based on hand position.
  3. Move the astronaut to the new position.

Complete coding exercise 4.a in GestureAction.cs, or use our completed solution below:

using HoloToolkit.Unity.InputModule;
using UnityEngine;

/// <summary>
/// GestureAction performs custom actions based on
/// which gesture is being performed.
/// </summary>
public class GestureAction : MonoBehaviour, INavigationHandler, IManipulationHandler, ISpeechHandler
{
    [Tooltip("Rotation max speed controls amount of rotation.")]
    [SerializeField]
    private float RotationSensitivity = 10.0f;

    private bool isNavigationEnabled = true;
    public bool IsNavigationEnabled
    {
        get { return isNavigationEnabled; }
        set { isNavigationEnabled = value; }
    }

    private Vector3 manipulationOriginalPosition = Vector3.zero;

    void INavigationHandler.OnNavigationStarted(NavigationEventData eventData)
    {
        InputManager.Instance.PushModalInputHandler(gameObject);
    }

    void INavigationHandler.OnNavigationUpdated(NavigationEventData eventData)
    {
        if (isNavigationEnabled)
        {
            /* TODO: DEVELOPER CODING EXERCISE 2.c */

            // 2.c: Calculate a float rotationFactor based on eventData's NormalizedOffset.x multiplied by RotationSensitivity.
            // This will help control the amount of rotation.
            float rotationFactor = eventData.NormalizedOffset.x * RotationSensitivity;

            // 2.c: transform.Rotate around the Y axis using rotationFactor.
            transform.Rotate(new Vector3(0, -1 * rotationFactor, 0));
        }
    }

    void INavigationHandler.OnNavigationCompleted(NavigationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void INavigationHandler.OnNavigationCanceled(NavigationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void IManipulationHandler.OnManipulationStarted(ManipulationEventData eventData)
    {
        if (!isNavigationEnabled)
        {
            InputManager.Instance.PushModalInputHandler(gameObject);

            manipulationOriginalPosition = transform.position;
        }
    }

    void IManipulationHandler.OnManipulationUpdated(ManipulationEventData eventData)
    {
        if (!isNavigationEnabled)
        {
            /* TODO: DEVELOPER CODING EXERCISE 4.a */

            // 4.a: Make this transform's position be the manipulationOriginalPosition + eventData.CumulativeDelta
            transform.position = manipulationOriginalPosition + eventData.CumulativeDelta;
        }
    }

    void IManipulationHandler.OnManipulationCompleted(ManipulationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void IManipulationHandler.OnManipulationCanceled(ManipulationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void ISpeechHandler.OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        if (eventData.RecognizedText.Equals("Move Astronaut"))
        {
            isNavigationEnabled = false;
        }
        else if (eventData.RecognizedText.Equals("Rotate Astronaut"))
        {
            isNavigationEnabled = true;
        }
        else
        {
            return;
        }

        eventData.Use();
    }
}

Build and Deploy

Chapter 5 - Model expansion

[!VIDEO https://www.youtube.com/embed/dA11P4P0VO8]

Objectives

Instructions

In this section, we will accomplish the following tasks:

  1. Add a new keyword “Expand Model” to expand the astronaut model.
  2. Add a new keyword “Reset Model” to return the model to its original form.

We’ll do this by adding two more keywords to the Speech Input Source from the previous chapter. We’ll also demonstrate another way to handle recognition events.

  1. Select AstronautManager in the Hierarchy again, then expand the Keywords section in the Inspector.
  2. Click the + on the right-hand side to add a new keyword.
  3. Type the Keyword as Expand Model. Feel free to add a Key Shortcut if desired.
  4. Click the + on the right-hand side to add a new keyword.
  5. Type the Keyword as Reset Model. Feel free to add a Key Shortcut if desired.
  6. In the Inspector panel, click the Add Component button.
  7. In the menu, type in the search box Speech Input Handler. Select the search result.
  8. Check Is Global Listener, since we want these commands to work regardless of which GameObject is focused.
  9. Click the + button and select Expand Model from the Keyword dropdown.
  10. Click the + under Response, and drag the AstronautManager from the Hierarchy into the None (Object) field.
  11. Now, click the No Function dropdown, select AstronautManager, then ExpandModelCommand.
  12. Click the Speech Input Handler’s + button and select Reset Model from the Keyword dropdown.
  13. Click the + under Response, and drag the AstronautManager from the Hierarchy into the None (Object) field.
  14. Now, click the No Function dropdown, select AstronautManager, then ResetModelCommand.
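Since AstronautManager.cs is not reproduced in this page, here is a hypothetical shape for the two public methods wired up above. The real script in the project does the actual expansion work; this sketch only shows the no-argument public methods Speech Input Handler can invoke, and the animator parameter is an assumption for illustration:

```csharp
using UnityEngine;

/// <summary>
/// Hypothetical sketch of the two methods selected in the Speech Input
/// Handler's No Function dropdown. Method names match the ones chosen
/// in the Inspector; the bodies are illustrative only.
/// </summary>
public class AstronautManagerSketch : MonoBehaviour
{
    [Tooltip("Animator driving the astronaut model's expanded state (illustrative).")]
    [SerializeField]
    private Animator modelAnimator = null;

    // Invoked by the "Expand Model" keyword.
    public void ExpandModelCommand()
    {
        modelAnimator.SetBool("Expanded", true);
    }

    // Invoked by the "Reset Model" keyword.
    public void ResetModelCommand()
    {
        modelAnimator.SetBool("Expanded", false);
    }
}
```

The key constraint is that the handlers must be public, parameterless (or single-parameter) methods so they appear in the UnityEvent function dropdown.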

How to set up the Speech Input Source and Handler for Chapter 5

Build and Deploy

The End

Congratulations! You have now completed MR Input 211: Gesture.