In the previous tutorial, you added an ARSession, Pawn, and Game Mode to complete the mixed reality setup for the chess app. This section focuses on using the open-source Mixed Reality Toolkit UX Tools plugin, which provides tools to make the scene interactive. By the end of this section, your chess pieces will move in response to user input.
Before you start working with user input, you’ll need to add the Mixed Reality UX Tools plugin to the project. To learn more about UX Tools, you can check out the project on GitHub.
[!NOTE] If you don’t see the UXTools Content section in the Content Browser, check View Options > Show Engine Content and View Options > Show Plugin Content.
Additional plugin documentation can be found on the Mixed Reality UX Tools GitHub repository.
With the plugin installed, you’re ready to start using the tools it has to offer, starting with hand interaction actors.
Hand interaction with UX elements is done with Hand Interaction Actors, which create and drive the pointers and visuals for near and far interactions.
In our case, you’ll add a Hand Interaction Actor to MRPawn for each hand so the virtual hands can drive those near and far interactions.
We recommend reading through the documentation on hand interactions before continuing.
Once you’re ready, open the MRPawn Blueprint and go to the Event Graph.
Your Event Graph should match the following screenshot:
Both Uxt Hand Interaction Actors need owners and initial transform locations. The initial transform doesn’t matter in this case because UX Tools will have the Hand Interaction Actors jump to the virtual hands as soon as they’re visible. However, the SpawnActor function requires a Transform input to avoid a compiler error, so you’ll use the default values.
Make sure the connections match the following screenshot, but feel free to drag around nodes to make your Blueprint more readable.
You can find more information about Hand Interaction Actors in the UX Tools documentation.
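The tutorial wires this up entirely in Blueprints, but if you’d rather see the same idea in C++, here’s a minimal sketch of spawning the two Hand Interaction Actors on BeginPlay. The class name, header path, and `Hand` property are assumptions about the UX Tools plugin’s C++ API, so verify them against the plugin source; `AMRPawn` is simply a stand-in for your Pawn’s C++ parent class.

```cpp
// Sketch only: spawn one Uxt Hand Interaction Actor per hand when the Pawn starts.
// The class name, header path, and Hand property are assumptions - check the
// UX Tools plugin source for the exact API before using this.
#include "MRPawn.h"
#include "Interactions/UxtHandInteractionActor.h" // assumed header location

void AMRPawn::BeginPlay()
{
    Super::BeginPlay();

    FActorSpawnParameters Params;
    Params.Owner = this; // both Hand Interaction Actors need an owner

    // The initial transform doesn't matter (UX Tools moves the actors to the
    // tracked hands), but SpawnActor still needs one, so pass the identity.
    for (EControllerHand Hand : { EControllerHand::Left, EControllerHand::Right })
    {
        AUxtHandInteractionActor* HandActor = GetWorld()->SpawnActor<AUxtHandInteractionActor>(
            AUxtHandInteractionActor::StaticClass(), FTransform::Identity, Params);
        if (HandActor)
        {
            HandActor->Hand = Hand; // drive the left or right virtual hand
        }
    }
}
```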
Now the virtual hands in the project have a way of selecting objects, but they still can’t manipulate them. Your last task before testing the app is to add Manipulator components to the actors in the scene.
A Manipulator is a component that responds to articulated hand input and can be grabbed, rotated, and translated. Applying the Manipulator’s transform to an Actor’s transform allows direct Actor manipulation.
You can find more information about the Manipulator Components provided in the Mixed Reality UX Tools plugin in the documentation.
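If your chess pieces are C++ actors rather than pure Blueprints, the same component could be created in the actor’s constructor, as in the rough sketch below. `AChessPiece`, `Mesh`, and `Manipulator` are hypothetical names for this example, and `UUxtGenericManipulatorComponent` is an assumed name for the plugin’s manipulator class; confirm the class and header in the UX Tools source.

```cpp
// Sketch only: give a C++ actor a Manipulator component so it can be grabbed,
// rotated, and translated by articulated hand input. The UX Tools class and
// header names here are assumptions - verify them against the plugin.
#include "ChessPiece.h"
#include "Interactions/UxtGenericManipulatorComponent.h" // assumed header location

AChessPiece::AChessPiece()
{
    // Visible mesh for the piece; assign the actual static mesh asset in the editor.
    Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
    RootComponent = Mesh;

    // The Manipulator responds to hand input and applies grab, rotate, and
    // translate gestures to this actor's transform, enabling direct manipulation.
    Manipulator = CreateDefaultSubobject<UUxtGenericManipulatorComponent>(TEXT("Manipulator"));
}
```

In the tutorial itself you get the same result in the editor by adding the Manipulator component directly to the Blueprint actors, with no code required.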
Good news everyone! You’re ready to test out the app with its new virtual hands and user input. Press Play in the Main Window and you’ll see two mesh hands with rays extending from each hand’s palm. You can control the hands and their interactions with the editor’s input simulation.
[!NOTE] Input simulation may not work if you have multiple headsets plugged into your PC. If you’re having issues, try unplugging your other headsets.
Try using the simulated hands to pick up, move, and set down the white chess king and manipulate the board! Experiment with both near and far interaction: notice that when your hands get close enough to grab the board and king directly, a finger cursor at the tip of the index finger replaces the hand ray.
You can find more information about the simulated hands feature provided by the MRTK UX Tools plugin in the documentation.
Now that your virtual hands can interact with objects, you’re ready to move on to the next tutorial and add user interfaces and events.
Next Section: 5. Adding a button & resetting piece locations