Keyboard input in Unity

Namespace: UnityEngine
Type: TouchScreenKeyboard

While HoloLens supports many forms of input including Bluetooth keyboards, most applications can’t assume that all users will have a physical keyboard available. If your application requires text input, some form of on-screen keyboard should be provided.

Unity provides the TouchScreenKeyboard class for accepting keyboard input when there’s no physical keyboard available.

HoloLens system keyboard behavior in Unity

On HoloLens, the TouchScreenKeyboard invokes the system's on-screen keyboard, which overlays directly on top of the volumetric view of your MR application. The experience is similar to using the keyboard in the built-in HoloLens apps. Note that the system keyboard behaves according to the target platform's capabilities; for example, the keyboard on HoloLens 2 supports direct hand interactions, while the keyboard on HoloLens (1st gen) supports GGV (Gaze, Gesture, and Voice). Additionally, the system keyboard doesn't appear when performing Unity Remoting from the editor to a HoloLens.

Using the system keyboard in your Unity app

Declare the keyboard

In your class, declare a variable to store the TouchScreenKeyboard instance and a variable to hold the string the keyboard returns.

UnityEngine.TouchScreenKeyboard keyboard;
public static string keyboardText = "";

Invoke the keyboard

When an event occurs requesting keyboard input, use the following to show the keyboard.

keyboard = TouchScreenKeyboard.Open("text to edit");

You can pass additional parameters into the TouchScreenKeyboard.Open function to control the behavior of the keyboard (for example, setting placeholder text or enabling autocorrection). For the full list of parameters, refer to Unity's documentation.
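As a sketch of those optional parameters, the following call opens a single-line keyboard with autocorrection disabled and placeholder text. The parameter order here follows Unity's TouchScreenKeyboard.Open overloads; the placeholder string is an illustrative value, not part of the original sample:

```csharp
// Open a default-layout keyboard: autocorrection off, single-line,
// not secure, not an alert dialog, with placeholder text shown
// while the field is empty.
keyboard = TouchScreenKeyboard.Open(
    "",                                  // initial text to edit
    TouchScreenKeyboardType.Default,     // keyboard layout
    false,                               // autocorrection
    false,                               // multiline
    false,                               // secure (password) entry
    false,                               // alert style
    "Enter a name");                     // placeholder text (example value)
```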

Retrieve typed contents

The content can be retrieved by reading the keyboard.text property. You may want to retrieve the content every frame or only when the keyboard is closed.

keyboardText = keyboard.text;
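As one possible sketch, an Update loop in a hypothetical MonoBehaviour (not part of the original sample) could poll the keyboard each frame and stop once the keyboard reports that input has finished, using Unity's TouchScreenKeyboard.Status enum:

```csharp
void Update()
{
    if (keyboard != null)
    {
        // Copy the current contents every frame while the keyboard is open.
        keyboardText = keyboard.text;

        // Stop polling once the user has submitted or dismissed the keyboard.
        if (keyboard.status == TouchScreenKeyboard.Status.Done ||
            keyboard.status == TouchScreenKeyboard.Status.Canceled)
        {
            keyboard = null;
        }
    }
}
```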

Alternative keyboard options

Besides using the TouchScreenKeyboard class directly, you can also get user input by using Unity's UI Input Field or TextMeshPro Input Field. Additionally, there's an implementation based on TouchScreenKeyboard in the HandInteractionExamples scene of MRTK (there's a keyboard interaction sample on the left-hand side).

Next Development Checkpoint

If you’re following the Unity development journey we’ve laid out, you’re in the midst of exploring the Mixed Reality platform capabilities and APIs. From here, you can continue to any topic or jump directly to deploying your app on a device or emulator.

[!div class="nextstepaction"] Deploy to HoloLens or Windows Mixed Reality immersive headsets