If you’re new to Holographic Remoting, you may want to read our overview.
[!IMPORTANT] This document describes the creation of a remote application for HoloLens 2 and Windows Mixed Reality headsets using the OpenXR API. Remote applications for HoloLens (1st gen) must use NuGet package version 1.x.x. This implies that remote applications written for HoloLens 2 are not compatible with HoloLens 1 and vice versa. The documentation for HoloLens 1 can be found here.
Holographic Remoting apps can stream remotely rendered content to HoloLens 2 and Windows Mixed Reality immersive headsets. You can also access more system resources and integrate remote immersive views into existing desktop PC software. A remote app receives an input data stream from HoloLens 2, renders content in a virtual immersive view, and streams content frames back to HoloLens 2. The connection is made using standard Wi-Fi. Holographic Remoting is added to a desktop or UWP app via a NuGet package. Additional code is required to handle the connection and render in an immersive view. A typical remoting connection has latency as low as 50 ms. The player app can report the latency in real time.
All code on this page and working projects can be found in the Holographic Remoting samples GitHub repository.
A good starting point is a working OpenXR-based desktop or UWP app. For details, see Getting started with OpenXR.
[!IMPORTANT] Any app using Holographic Remoting should be authored to use a multi-threaded apartment. The use of a single-threaded apartment is supported but will lead to sub-optimal performance and possibly stuttering during playback. When using C++/WinRT, winrt::init_apartment defaults to a multi-threaded apartment.
The following steps are required to add the NuGet package to a project in Visual Studio.
[!NOTE] Version 1.x.x of the NuGet package is still available for developers who want to target HoloLens 1. For details see Add Holographic Remoting (HoloLens (1st gen)).
The first step in your remote app is to select the Holographic Remoting OpenXR runtime, which is part of the Microsoft.Holographic.Remoting.OpenXr NuGet package. You do this by setting the XR_RUNTIME_JSON environment variable to the path of the RemotingXR.json file within your app. The OpenXR loader uses this environment variable to redirect to the Holographic Remoting OpenXR runtime instead of the system default OpenXR runtime. When you use the Microsoft.Holographic.Remoting.OpenXr NuGet package, the RemotingXR.json file is automatically copied to the output folder during compilation, so the OpenXR runtime selection typically looks as follows.
```cpp
bool EnableRemotingXR() {
    wchar_t executablePath[MAX_PATH];
    if (GetModuleFileNameW(NULL, executablePath, ARRAYSIZE(executablePath)) == 0) {
        return false;
    }

    std::filesystem::path filename(executablePath);
    filename = filename.replace_filename("RemotingXR.json");
    if (std::filesystem::exists(filename)) {
        SetEnvironmentVariableW(L"XR_RUNTIME_JSON", filename.c_str());
        return true;
    }

    return false;
}
```
The first actions a typical OpenXR app takes are selecting OpenXR extensions and creating an XrInstance. The OpenXR core specification doesn't provide any remoting-specific API, so Holographic Remoting introduces its own OpenXR extension named XR_MSFT_holographic_remoting. Ensure that XR_MSFT_HOLOGRAPHIC_REMOTING_EXTENSION_NAME is included in the XrInstanceCreateInfo of the xrCreateInstance call.
[!TIP] By default, the rendered content of your app is only streamed to the Holographic Remoting player, running either on a HoloLens 2 or on a Windows Mixed Reality headset. To also display the rendered content on the remote PC, via the swap chain of a window for instance, Holographic Remoting provides a second OpenXR extension named XR_MSFT_holographic_remoting_frame_mirroring. If you want to use that functionality, also enable this extension using XR_MSFT_HOLOGRAPHIC_REMOTING_FRAME_MIRRORING_EXTENSION_NAME.
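Putting the two extensions together, the list passed to xrCreateInstance can be sketched as follows. This is a minimal, self-contained sketch: the extension name strings are written out inline (in real code they come from the remoting headers via XR_MSFT_HOLOGRAPHIC_REMOTING_EXTENSION_NAME and XR_MSFT_HOLOGRAPHIC_REMOTING_FRAME_MIRRORING_EXTENSION_NAME), and BuildExtensionList is a hypothetical helper, not part of the remoting API.

```cpp
#include <vector>

// Extension name strings written out inline so this sketch is self-contained;
// the remoting headers provide them as macros.
constexpr const char* kRemotingExtension = "XR_MSFT_holographic_remoting";
constexpr const char* kFrameMirroringExtension = "XR_MSFT_holographic_remoting_frame_mirroring";

// Hypothetical helper: assembles the list assigned to
// XrInstanceCreateInfo::enabledExtensionNames before calling xrCreateInstance.
std::vector<const char*> BuildExtensionList(bool enableFrameMirroring) {
    std::vector<const char*> extensions{kRemotingExtension};
    if (enableFrameMirroring) {
        extensions.push_back(kFrameMirroringExtension);
    }
    return extensions;
}
```

The returned vector would then feed enabledExtensionCount and enabledExtensionNames of XrInstanceCreateInfo.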
[!IMPORTANT] To learn about the Holographic Remoting OpenXR extension API, check out the specification, which can be found in the Holographic Remoting samples GitHub repository.
After your remote app has created the XrInstance and queried the XrSystemId via xrGetSystem, a connection to the player device can be established.
[!WARNING] The Holographic Remoting OpenXR runtime is only able to provide device specific data such as view configurations or environment blend modes after a connection has been established.
xrEnumerateViewConfigurations, xrEnumerateViewConfigurationViews, xrGetViewConfigurationProperties, xrEnumerateEnvironmentBlendModes, and xrGetSystemProperties return default values before a connection is fully established, matching what you would typically get when connecting to a player running on a HoloLens 2. It is strongly recommended not to call these methods before a connection has been established. Instead, use these methods after the XrSession has been successfully created and the session state is at least XR_SESSION_STATE_READY.
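The recommendation above can be expressed as a small guard. This is an illustrative sketch: the session-state values are stubbed locally so the snippet is self-contained (in real code they come from the OpenXR headers as XrSessionState), and SafeToQueryDeviceProperties is a hypothetical helper, not part of the remoting API.

```cpp
// Stand-in for the relevant XrSessionState values; in real code these come
// from the OpenXR headers. The ordering mirrors the session lifecycle.
enum SessionStateStub {
    kStateUnknown = 0,
    kStateIdle,
    kStateReady,        // XR_SESSION_STATE_READY
    kStateSynchronized,
    kStateVisible,
    kStateFocused,
    kStateStopping,
    kStateLossPending,  // XR_SESSION_STATE_LOSS_PENDING
    kStateExiting
};

// Hypothetical guard: only query view configurations, blend modes, or system
// properties once the session exists, has reached at least READY, and is not
// already being lost or torn down.
bool SafeToQueryDeviceProperties(bool sessionCreated, SessionStateStub state) {
    return sessionCreated &&
           state >= kStateReady &&
           state != kStateLossPending &&
           state != kStateExiting;
}
```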
General properties, such as max bitrate, audio enabled, video codec, or depth buffer stream resolution, can be configured via xrRemotingSetContextPropertiesMSFT as follows.
```cpp
XrRemotingRemoteContextPropertiesMSFT contextProperties{
    static_cast<XrStructureType>(XR_TYPE_REMOTING_REMOTE_CONTEXT_PROPERTIES_MSFT)};
contextProperties.enableAudio = false;
contextProperties.maxBitrateKbps = 20000;
contextProperties.videoCodec = XR_REMOTING_VIDEO_CODEC_H265_MSFT;
contextProperties.depthBufferStreamResolution = XR_REMOTING_DEPTH_BUFFER_STREAM_RESOLUTION_HALF_MSFT;
xrRemotingSetContextPropertiesMSFT(m_instance.Get(), m_systemId, &contextProperties);
```
The connection can be established in one of two ways:

1. The remote app connects to the player running on the device.
2. The player running on the device connects to the remote app.
To establish a connection from the remote app to the player device, call the xrRemotingConnectMSFT function, specifying the hostname and port via the XrRemotingConnectInfoMSFT structure. The port used by the Holographic Remoting Player is 8265.
```cpp
XrRemotingConnectInfoMSFT connectInfo{static_cast<XrStructureType>(XR_TYPE_REMOTING_CONNECT_INFO_MSFT)};
connectInfo.remoteHostName = "192.168.x.x";
connectInfo.remotePort = 8265;
connectInfo.secureConnection = false;
xrRemotingConnectMSFT(m_instance.Get(), m_systemId, &connectInfo);
```
To listen for incoming connections on the remote app, call the xrRemotingListenMSFT function. Both the handshake port and transport port can be specified via the XrRemotingListenInfoMSFT structure. The handshake port is used for the initial handshake; the data is then sent over the transport port. By default, 8265 and 8266 are used.
```cpp
XrRemotingListenInfoMSFT listenInfo{static_cast<XrStructureType>(XR_TYPE_REMOTING_LISTEN_INFO_MSFT)};
listenInfo.listenInterface = "0.0.0.0";
listenInfo.handshakeListenPort = 8265;
listenInfo.transportListenPort = 8266;
listenInfo.secureConnection = false;
xrRemotingListenMSFT(m_instance.Get(), m_systemId, &listenInfo);
```
The connection state must be disconnected when you call xrRemotingConnectMSFT or xrRemotingListenMSFT. You can get the connection state at any point after you have created an XrInstance and queried the XrSystemId, via xrRemotingGetConnectionStateMSFT.
```cpp
XrRemotingConnectionStateMSFT connectionState;
xrRemotingGetConnectionStateMSFT(m_instance.Get(), m_systemId, &connectionState, nullptr);
```
Available connection states are:

- XR_REMOTING_CONNECTION_STATE_DISCONNECTED_MSFT
- XR_REMOTING_CONNECTION_STATE_CONNECTING_MSFT
- XR_REMOTING_CONNECTION_STATE_CONNECTED_MSFT
[!IMPORTANT] xrRemotingConnectMSFT or xrRemotingListenMSFT must be called before trying to create an XrSession via xrCreateSession. If you try to create an XrSession while the connection state is XR_REMOTING_CONNECTION_STATE_DISCONNECTED_MSFT, the session creation will succeed but the session state will immediately transition to XR_SESSION_STATE_LOSS_PENDING.
Holographic Remoting's implementation of xrCreateSession supports waiting for a connection to be established. You can call xrRemotingConnectMSFT or xrRemotingListenMSFT immediately followed by a call to xrCreateSession, which blocks and waits for a connection to be established. The timeout with xrRemotingConnectMSFT is fixed to 10 seconds, and is unlimited with xrRemotingListenMSFT. If a connection can be established within this time, the XrSession creation succeeds and the session state transitions to XR_SESSION_STATE_READY. If no connection can be established, the session creation also succeeds but the session state immediately transitions to XR_SESSION_STATE_LOSS_PENDING.
In general, the connection state is coupled with the XrSession state. Any change to the connection state also affects the session state. For instance, if the connection state switches from XR_REMOTING_CONNECTION_STATE_CONNECTED_MSFT to XR_REMOTING_CONNECTION_STATE_DISCONNECTED_MSFT, the session state transitions to XR_SESSION_STATE_LOSS_PENDING as well.
The Holographic Remoting OpenXR runtime exposes three events that are important for monitoring the state of a connection:

1. XR_TYPE_REMOTING_EVENT_DATA_CONNECTED_MSFT: Triggered when a connection to the device has been successfully established.
2. XR_TYPE_REMOTING_EVENT_DATA_DISCONNECTED_MSFT: Triggered if an established connection is closed or a connection couldn't be established.
3. XR_TYPE_REMOTING_EVENT_DATA_LISTENING_MSFT: Triggered when listening for incoming connections starts.

These events are placed in a queue, and your remote app must read the queue regularly via xrPollEvent.
```cpp
auto pollEvent = [&](XrEventDataBuffer& eventData) -> bool {
    eventData.type = XR_TYPE_EVENT_DATA_BUFFER;
    eventData.next = nullptr;
    return CHECK_XRCMD(xrPollEvent(m_instance.Get(), &eventData)) == XR_SUCCESS;
};

XrEventDataBuffer eventData{};
while (pollEvent(eventData)) {
    switch (eventData.type) {
    ...
    case XR_TYPE_REMOTING_EVENT_DATA_LISTENING_MSFT: {
        DEBUG_PRINT("Holographic Remoting: Listening on port %d",
                    reinterpret_cast<const XrRemotingEventDataListeningMSFT*>(&eventData)->listeningPort);
        break;
    }
    case XR_TYPE_REMOTING_EVENT_DATA_CONNECTED_MSFT: {
        DEBUG_PRINT("Holographic Remoting: Connected.");
        break;
    }
    case XR_TYPE_REMOTING_EVENT_DATA_DISCONNECTED_MSFT: {
        DEBUG_PRINT("Holographic Remoting: Disconnected - Reason: %d",
                    reinterpret_cast<const XrRemotingEventDataDisconnectedMSFT*>(&eventData)->disconnectReason);
        break;
    }
    }
}
```
To display the same content in the remote app that is sent to the device, the XR_MSFT_holographic_remoting_frame_mirroring extension can be used. With this extension, you can submit a texture to xrEndFrame by using an XrRemotingFrameMirrorImageInfoMSFT structure that is chained to the XrFrameEndInfo, as follows.
```cpp
XrFrameEndInfo frameEndInfo{XR_TYPE_FRAME_END_INFO};
...

XrRemotingFrameMirrorImageD3D11MSFT mirrorImageD3D11{
    static_cast<XrStructureType>(XR_TYPE_REMOTING_FRAME_MIRROR_IMAGE_D3D11_MSFT)};
mirrorImageD3D11.texture = m_window->GetNextSwapchainTexture();

XrRemotingFrameMirrorImageInfoMSFT mirrorImageEndInfo{
    static_cast<XrStructureType>(XR_TYPE_REMOTING_FRAME_MIRROR_IMAGE_INFO_MSFT)};
mirrorImageEndInfo.image = reinterpret_cast<const XrRemotingFrameMirrorImageBaseHeaderMSFT*>(&mirrorImageD3D11);
frameEndInfo.next = &mirrorImageEndInfo;

xrEndFrame(m_session.Get(), &frameEndInfo);
m_window->PresentSwapchain();
```
The example above uses a DX11 swap chain texture and presents the window immediately after the call to xrEndFrame. The usage isn't restricted to swap chain textures, and no additional GPU synchronization is required. For details on usage and constraints, check out the extension specification. If your remote app uses DX12, use XrRemotingFrameMirrorImageD3D12MSFT instead of XrRemotingFrameMirrorImageD3D11MSFT.
Starting with version 2.5.0, custom data channels can be used with the OpenXR API to send user data over the already-established remoting connection. For more information, see Custom Data Channels with the OpenXR API.
Starting with version 2.6.0, the XR_MSFT_holographic_remoting_speech extension allows the remote app to react to speech commands detected by the player app with the OpenXR API.
[!IMPORTANT] The detailed specification can be found in the Holographic Remoting samples GitHub repository.
To initialize a speech recognizer on the player app, the remote app can call xrInitializeRemotingSpeechMSFT. This call transmits the speech initialization parameters, which consist of a language, a dictionary of phrases, and the contents of a grammar file, to the player app.
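The three initialization parameters can be gathered up front before the call. This is only a sketch: the real call takes an XrRemotingSpeechInitInfoMSFT structure from the remoting headers, whose exact field layout is not shown here; SpeechInitData and MakeSpeechInitData are hypothetical names used for illustration.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical container for the three pieces of data that
// xrInitializeRemotingSpeechMSFT transmits to the player app. The real API
// uses an XrRemotingSpeechInitInfoMSFT struct instead of plain containers.
struct SpeechInitData {
    std::string language;                 // e.g. "en-US"
    std::vector<std::string> dictionary;  // phrases to recognize
    std::vector<uint8_t> grammarFile;     // grammar file contents, may be empty
};

SpeechInitData MakeSpeechInitData() {
    SpeechInitData data;
    data.language = "en-US";
    data.dictionary = {"select", "next page", "go back"};
    // grammarFile left empty: a phrase dictionary alone is sufficient here.
    return data;
}
```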
[!NOTE] Before version 2.6.1, the speech recognizer must only be initialized once per XrSession.
If the creation of the speech recognizer succeeded, as indicated by the XR_TYPE_EVENT_DATA_REMOTING_SPEECH_RECOGNIZER_STATE_CHANGED_MSFT event, the remote app is notified when a speech recognition result is generated on the player app.

The XrEventDataRemotingSpeechRecognizerStateChangedMSFT event structure is placed in the event queue when the state of the speech recognizer on the player side changes. XrRemotingSpeechRecognizerStateMSFT defines all possible states of the speech recognizer on the player side, and the XrEventDataRemotingSpeechRecognizedMSFT event structure is placed in the event queue when the speech recognizer on the player side has recognized a phrase.

After the remote app is notified about a recognized phrase, it can retrieve the recognized phrase by calling xrRetrieveRemotingSpeechRecognizedTextMSFT.
[!NOTE] The XrRemotingSpeechRecognitionConfidenceMSFT enum is a direct mapping of the SpeechRecognitionConfidence enum returned with the speech recognition result by the Windows Speech Recognition API.
Starting with version 2.7.0, coordinate system synchronization can be used to align spatial data between the player and remote app. For more information, see Coordinate System Synchronization with Holographic Remoting Overview.