[!IMPORTANT] The PV camera is implemented in both the Windows Mixed Reality and OpenXR plugins. However, OpenXR requires the Microsoft OpenXR plugin to be installed. OpenXR for Unreal 4.26 also has a limitation: the camera only works with the DirectX11 RHI. This limitation is fixed in Unreal 4.27.1 or later.
The camera on the HoloLens 2 is offset vertically from the device’s head tracking. A few functions exist to locate the camera in world space to account for the offset.
GetPVCameraToWorldTransform gets the transform of the PV Camera in world space, positioned at the camera lens:
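For illustration, here is a minimal C++ sketch of that call, assuming it's exposed as a static function on UHoloLensARFunctionLibrary in the HoloLensAR plugin (the class name, header, and exact signature are assumptions; in the editor it's available as a Blueprint node):

```cpp
#include "HoloLensARFunctionLibrary.h" // assumed header for the PV Camera helpers

void LogPVCameraLensPosition()
{
    // Assumed to return the PV Camera's transform in Unreal world space,
    // positioned at the camera lens.
    const FTransform CameraToWorld = UHoloLensARFunctionLibrary::GetPVCameraToWorldTransform();
    UE_LOG(LogTemp, Log, TEXT("PV Camera lens position: %s"), *CameraToWorld.GetLocation().ToString());
}
```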
GetWorldSpaceRayFromCameraPoint casts a ray from the camera lens into the scene in Unreal world space to find a pixel’s content in the camera frame:
GetPVCameraIntrinsics returns the camera intrinsic values, which can be used when doing computer vision processing on a camera frame:
To find what exists in world space at a particular pixel coordinate, use a line trace with the world space ray:
Here we cast a 2-meter ray from the camera lens to the camera-space position ¼ from the top left of the frame, then use the hit result to render something where the object exists in world space:
When using spatial mapping, this hit position will match the surface that the camera is seeing.
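Below is a C++ sketch of that trace. It assumes GetWorldSpaceRayFromCameraPoint is exposed on UHoloLensARFunctionLibrary (class name, header, and exact signature are assumptions), and uses a debug sphere to mark the hit; TraceFromCameraPixel is a hypothetical helper, not part of the original walkthrough:

```cpp
#include "ARBlueprintLibrary.h"
#include "DrawDebugHelpers.h"
#include "HoloLensARFunctionLibrary.h" // assumed header for the PV Camera helpers

void TraceFromCameraPixel(UWorld* World)
{
    // Use the camera intrinsics to find the pixel 1/4 from the top left of the frame.
    FARCameraIntrinsics Intrinsics;
    UARBlueprintLibrary::GetCameraIntrinsics(Intrinsics);
    const FVector2D Pixel(Intrinsics.ImageResolution.X * 0.25f, Intrinsics.ImageResolution.Y * 0.25f);

    // Assumed signature: returns the world-space ray for that pixel, starting at the camera lens.
    FVector RayOrigin, RayDirection;
    UHoloLensARFunctionLibrary::GetWorldSpaceRayFromCameraPoint(Pixel, RayOrigin, RayDirection);

    // Trace 2 meters (200 Unreal units) along the ray.
    const FVector RayEnd = RayOrigin + RayDirection * 200.0f;
    FHitResult Hit;
    if (World->LineTraceSingleByChannel(Hit, RayOrigin, RayEnd, ECC_Visibility))
    {
        // With spatial mapping enabled, Hit.Location lies on the surface the camera is seeing.
        DrawDebugSphere(World, Hit.Location, 2.0f, 12, FColor::Green);
    }
}
```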
```csharp
PublicDependencyModuleNames.AddRange(
    new string[] {
        "Core",
        "CoreUObject",
        "Engine",
        "InputCore",
        "AugmentedReality"
});
```
#include "ARBlueprintLibrary.h"
```cpp
private:
    UStaticMesh* StaticMesh;
    UStaticMeshComponent* StaticMeshComponent;
    UMaterialInstanceDynamic* DynamicMaterial;
    bool IsTextureParamSet = false;
```
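For context, here is a minimal sketch of how those members might sit in the actor's header, assuming a plain AActor subclass named ACamCapture (the name used in the .cpp below); the generated-header include and UCLASS boilerplate follow the usual Unreal pattern:

```cpp
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "CamCapture.generated.h"

class UStaticMesh;
class UStaticMeshComponent;
class UMaterialInstanceDynamic;

UCLASS()
class ACamCapture : public AActor
{
    GENERATED_BODY()

public:
    ACamCapture();

    virtual void BeginPlay() override;
    virtual void Tick(float DeltaTime) override;

private:
    UStaticMesh* StaticMesh;
    UStaticMeshComponent* StaticMeshComponent;
    UMaterialInstanceDynamic* DynamicMaterial;
    bool IsTextureParamSet = false;
};
```

The constructor below then loads a cube mesh from the engine and sets up the static mesh component that the camera feed is rendered onto.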
```cpp
ACamCapture::ACamCapture()
{
    PrimaryActorTick.bCanEverTick = true;

    // Load a mesh from the engine to render the camera feed to.
    StaticMesh = LoadObject<UStaticMesh>(nullptr, TEXT("/Engine/EngineMeshes/Cube.Cube"), nullptr, LOAD_None, nullptr);

    // Create a static mesh component to render the static mesh
    StaticMeshComponent = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("CameraPlane"));
    StaticMeshComponent->SetStaticMesh(StaticMesh);

    // Scale and add to the scene
    StaticMeshComponent->SetWorldScale3D(FVector(0.1f, 1, 1));
    this->SetRootComponent(StaticMeshComponent);
}
```
In BeginPlay, create a dynamic material instance from the project’s camera material, apply it to the static mesh component, and start the HoloLens camera.
In the editor, right-click on the CamTextureMaterial in the content browser and select “Copy Reference” to get the string for CameraMatPath.
```cpp
void ACamCapture::BeginPlay()
{
    Super::BeginPlay();

    // Create a dynamic material instance from the game's camera material.
    // Right-click on a material in the project and select "Copy Reference" to get this string.
    FString CameraMatPath("Material'/Game/Materials/CamTextureMaterial.CamTextureMaterial'");
    UMaterial* BaseMaterial = (UMaterial*)StaticLoadObject(UMaterial::StaticClass(), nullptr, *CameraMatPath, nullptr, LOAD_None, nullptr);
    DynamicMaterial = UMaterialInstanceDynamic::Create(BaseMaterial, this);

    // Use the dynamic material instance when rendering the camera mesh.
    StaticMeshComponent->SetMaterial(0, DynamicMaterial);

    // Start the webcam.
    UARBlueprintLibrary::ToggleARCapture(true, EARCaptureType::Camera);
}
```
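The capture keeps running once started. If the actor can be removed while the app keeps running, it's reasonable to stop the webcam again; this isn't part of the original walkthrough, but a matching shutdown using the same ToggleARCapture call could look like this (EndPlay would also need to be declared as an override in the header):

```cpp
void ACamCapture::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
    // Stop the webcam when the actor is removed or the level is torn down.
    UARBlueprintLibrary::ToggleARCapture(false, EARCaptureType::Camera);

    Super::EndPlay(EndPlayReason);
}
```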
In Tick, get the texture from the camera, set it on the CamTextureMaterial material’s texture parameter, and scale the static mesh component by the camera frame’s aspect ratio:
```cpp
void ACamCapture::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);

    // The dynamic material instance only needs to be set once.
    if (IsTextureParamSet)
    {
        return;
    }

    // Get the texture from the camera.
    UARTexture* ARTexture = UARBlueprintLibrary::GetARTexture(EARTextureType::CameraImage);
    if (ARTexture != nullptr)
    {
        // Set the shader's texture parameter (named "Param") to the camera image.
        DynamicMaterial->SetTextureParameterValue("Param", ARTexture);
        IsTextureParamSet = true;

        // Get the camera intrinsics.
        FARCameraIntrinsics Intrinsics;
        UARBlueprintLibrary::GetCameraIntrinsics(Intrinsics);

        // Scale the camera mesh by the aspect ratio.
        float R = (float)Intrinsics.ImageResolution.X / (float)Intrinsics.ImageResolution.Y;
        StaticMeshComponent->SetWorldScale3D(FVector(0.1f, R, 1));
    }
}
```
[!NOTE] This requires Unreal Engine 4.25 or newer.
The system and custom MRC recorders create mixed reality captures by combining the PV Camera with holograms rendered by the app.
By default, mixed reality capture uses the right eye’s holographic output. If an immersive app chooses to render from the PV Camera, then that will be used instead. Rendering from the PV Camera improves the mapping between the real world and the holograms in the MRC video.
To opt in to rendering from the PV Camera:
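The opt-in is done by calling SetEnabledMixedRealityCamera with true and ResizeMixedRealityCamera with the PV Camera’s resolution. A minimal C++ sketch, assuming these are exposed on UWindowsMixedRealityFunctionLibrary from the Windows Mixed Reality plugin (class name, header, and exact signatures are assumptions; 1080p is used only as an example resolution):

```cpp
#include "WindowsMixedRealityFunctionLibrary.h" // assumed header

// Call once at startup, for example from BeginPlay.
void EnablePVCameraRendering()
{
    // Ask the platform to render MRC from the photo/video camera's perspective.
    UWindowsMixedRealityFunctionLibrary::SetEnabledMixedRealityCamera(true);

    // Size the video frame to match the PV Camera's resolution (example: 1080p).
    UWindowsMixedRealityFunctionLibrary::ResizeMixedRealityCamera(FIntPoint(1920, 1080));
}
```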
Unreal will then handle requests from MRC to render from the PV Camera’s perspective.
[!NOTE] Only when Mixed Reality Capture is triggered will the app be asked to render from the photo/video camera’s perspective.
The webcam texture can be retrieved in the game at runtime, but it first needs to be enabled in the editor: in Edit > Project Settings, go to Platforms > HoloLens > Capabilities and check Webcam.
To render the camera image, follow the steps shown earlier in this article: add the AugmentedReality module dependency, include ARBlueprintLibrary.h, create the CamTextureMaterial with a texture parameter, and use an actor such as ACamCapture to start the capture and feed the camera texture into the material.