
xR for VR Workflow

The purpose of this workflow is to enable the production of Spatial Video (VR180 or VR360) content for consumption on Spatial Computing devices from a typical xR LED production stage, using standard broadcast cameras and lenses, through a combination of spherical rendering and content reprojection techniques.

The workflow is a combination of several features found in the Disguise Designer toolset.

Essentially, we take a cubemap rendered from a 360 scene and reproject the stage into the correct position within that scene, relative to the world's zero origin point.

1. Stage Setup

  1. Create a standard xR LED stage in the Disguise software. Add an LED wall and floor to the stage.
  2. Add a broadcast camera. In this example, it is called d3cam.
  3. Set up and calibrate the stage, ensuring lens, colour, and delays are fully calibrated.
  4. Position your broadcast camera so that it covers all participants on the stage from head to toe. You can preview this in the stage visualiser and use the mouse to pan, rotate, and zoom the stage into view.
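As a quick sanity check on camera placement, the minimum distance needed to frame a subject head to toe follows from the lens's vertical field of view. A minimal Python sketch, where the 2 m subject height and 30° vertical FOV are illustrative values rather than settings from this workflow:

```python
import math

def min_camera_distance(subject_height_m: float, vertical_fov_deg: float) -> float:
    """Distance at which a subject of the given height just fills the frame vertically."""
    half_fov = math.radians(vertical_fov_deg) / 2
    return (subject_height_m / 2) / math.tan(half_fov)

# Illustrative values: a 2 m participant and a 30 degree vertical FOV
print(round(min_camera_distance(2.0, 30.0), 2))  # ~3.73 m
```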

The xR Stage with Test Patterns

2. Add and Configure Virtual Cameras

In Designer, add 6x virtual cameras. The table below includes suggestions for naming and rotation settings of each virtual camera in the camera array.

| Virtual camera | Rotation x,y,z |
|---|---|
| vcam1-front | 0,0,0 |
| vcam2-left | 0,-90,0 |
| vcam3-right | 0,90,0 |
| vcam4-back | 0,180,0 |
| vcam5-top | 90,0,0 |
| vcam6-bottom | -90,0,0 |
  1. Set each virtual camera as a child of the broadcast camera, d3cam. d3cam Hierarchy - all virtual cameras are children of d3cam
  2. Ensure the virtual cameras all have the following settings:
    • Parent camera: d3cam
    • Coordinate system: Global
    • Offset: 0,0,0
    • Physical > Lens > Lens source: Local intrinsics
  3. Set all virtual camera resolutions to a square resolution / aspect ratio (e.g. 1920 x 1920 px). Overscan resolution will automatically update to the same resolution.
  4. Orient the virtual cameras to match compass points by rotating them around the Y axis, with the remaining two cameras facing up and down.
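As a check that these six rotations form a complete cubemap, each rotation can be converted to a view direction. A minimal Python sketch, assuming a Y-up coordinate system in which an unrotated camera looks along +Z and a positive X rotation tilts the camera upward (conventions chosen to match the table above; Disguise's exact Euler convention may differ):

```python
import math

# Rotation (x=pitch, y=yaw, z=roll) per virtual camera, from the table above
cameras = {
    "vcam1-front":  (0, 0, 0),
    "vcam2-left":   (0, -90, 0),
    "vcam3-right":  (0, 90, 0),
    "vcam4-back":   (0, 180, 0),
    "vcam5-top":    (90, 0, 0),
    "vcam6-bottom": (-90, 0, 0),
}

def view_direction(pitch_deg, yaw_deg, _roll_deg):
    """Forward vector for a camera rotated by pitch (about X) then yaw (about Y)."""
    p, y = math.radians(pitch_deg), math.radians(yaw_deg)
    return (
        round(math.cos(p) * math.sin(y), 3),  # x
        round(math.sin(p), 3),                # y
        round(math.cos(p) * math.cos(y), 3),  # z
    )

for name, rot in cameras.items():
    print(name, view_direction(*rot))
# The six directions point along +Z, -X, +X, -Z, +Y and -Y:
# one per cube face, so every direction from the camera is covered.
```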

Reference cubemap highlighting virtual camera positions

Camera List Editor

3. Add 6 MR Sets

  1. Create 6x MR Sets and number them sequentially.
  2. Assign each virtual camera to the Camera override property of its MR Set. The table below shows the virtual camera naming, rotation, and camera override assignments.

| Virtual camera | Rotation x,y,z | MR Set |
|---|---|---|
| vcam1-front | 0,0,0 | mrset1 |
| vcam2-left | 0,-90,0 | mrset2 |
| vcam3-right | 0,90,0 | mrset3 |
| vcam4-back | 0,180,0 | mrset4 |
| vcam5-top | 90,0,0 | mrset5 |
| vcam6-bottom | -90,0,0 | mrset6 |
  3. On mrset1, which is front-facing, add the wall and floor LEDs to On-Stage.
  4. Set the Output Resolution of all MR Sets to 1920 x 1920.
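This face size also fixes the resolution budget of the final spherical output. As a rule-of-thumb calculation (not a Disguise setting): an equirectangular frame sampled from a cubemap is typically four face-widths wide and two face-widths tall, so 1920 px faces correspond to roughly 7680 x 3840 for VR360, or 3840 x 3840 for the VR180 half, which matches the sphere resolution used in the next step.

```python
face = 1920                    # MR Set output resolution per cube face
vr360 = (4 * face, 2 * face)   # full-sphere equirectangular frame
vr180 = (2 * face, 2 * face)   # half-sphere (VR180) frame
print(vr360, vr180)            # (7680, 3840) (3840, 3840)
```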

MR Sets

MR Set settings

4. Add a Spherical Mesh as a Backplate

  1. In the Stage Editor, create a virtual projection surface, name it sphere, and select the puffersphere mesh.
  2. Scale the sphere to 10, 10, 10 and set its resolution to 3840 x 3840.
  3. Set the virtual projection surface to Backplate (MR) in the Render layer.

Sphere Scale, Mesh & Render layer: Backplate (MR)

5. Add a StageRender Layer

  1. Add a StageRender layer to the timeline.
  2. Create a new mapping named Spherical.
  3. Set the StageRender mapping to Spherical, and Render layer to Backplate.

Add a StageRender Layer

6. Set up the Output Feed

  1. Open the Feed View by left-clicking on Feed in the Dashboard.
  2. Position each MR Set feed so that it forms a cubemap, using the feed output image as a reference. Hold ALT and drag an arrow from each MR Set to the Output window.
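The exact arrangement depends on the layout your downstream conversion tool expects. A minimal Python sketch of one common option, a 3 x 2 cubemap strip, with a hypothetical face order (check your tool's documentation for the order and any face flips it requires):

```python
FACE = 1920  # matches the MR Set output resolution

# Hypothetical 3x2 strip: left/front/right on the top row,
# bottom/back/top on the bottom row, on a 5760 x 3840 output.
layout = ["vcam2-left", "vcam1-front", "vcam3-right",
          "vcam6-bottom", "vcam4-back", "vcam5-top"]

# Top-left pixel position of each MR Set feed on the output
positions = {name: ((i % 3) * FACE, (i // 3) * FACE) for i, name in enumerate(layout)}
for name, (x, y) in positions.items():
    print(f"{name}: x={x}, y={y}")
```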

7. Add VR180 or VR360 Content

Using a Video Layer

  1. If you are using a Video layer, add the layer and select your media.
  2. Add a Spherical mapping type, set it to 3D, and target the virtual LED screen.
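Under the hood, a spherical mapping of this kind relates each direction on the sphere to a pixel in the equirectangular source. A minimal Python sketch of the standard equirectangular lookup (not Disguise's internal implementation), assuming a Y-up, +Z-forward direction vector:

```python
import math

def equirect_uv(x: float, y: float, z: float) -> tuple[float, float]:
    """Map a unit direction vector (Y-up, +Z forward) to equirectangular UV in [0, 1]."""
    u = 0.5 + math.atan2(x, z) / (2 * math.pi)  # longitude -> horizontal position
    v = 0.5 - math.asin(y) / math.pi            # latitude  -> vertical position
    return u, v

print(equirect_uv(0, 0, 1))  # straight ahead -> (0.5, 0.5), frame centre
print(equirect_uv(0, 1, 0))  # straight up    -> (0.5, 0.0), top edge
```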

Spherical mapping

Using RenderStream

  1. Add a RenderStream Layer to your track.
  2. In the RenderStream Layer Editor, select your VR180 or VR360 asset, such as an Unreal project.
  3. Add a Cluster Pool of 6 RX III render nodes to the layer.
  4. Add a Spherical mapping type, set it to 3D, and target the virtual LED screen.
  5. Add a Channel.
  6. Start the Workload.

When you start RenderStream, you should see the six faces of a cubemap output from d3, ready for translation into VR180 or VR360 content via a third-party tool.
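As an illustration of what that third-party translation step involves, here is a minimal NumPy sketch that resamples six square face images into a VR360 equirectangular frame with nearest-neighbour lookup. The face names, basis vectors, and Y-up/+Z-forward convention are assumptions chosen to match the camera table above, and a production tool would also interpolate and blend seams:

```python
import numpy as np

# Face basis vectors (forward, right, up), matching the virtual camera
# rotations above under the assumed Y-up, +Z-forward convention.
FACES = {
    "front":  ((0, 0, 1),  (1, 0, 0),  (0, 1, 0)),
    "back":   ((0, 0, -1), (-1, 0, 0), (0, 1, 0)),
    "left":   ((-1, 0, 0), (0, 0, 1),  (0, 1, 0)),
    "right":  ((1, 0, 0),  (0, 0, -1), (0, 1, 0)),
    "top":    ((0, 1, 0),  (1, 0, 0),  (0, 0, -1)),
    "bottom": ((0, -1, 0), (1, 0, 0),  (0, 0, 1)),
}

def cubemap_to_equirect(faces, width=7680):
    """faces: dict of square HxWx3 image arrays keyed by the FACES names."""
    height = width // 2
    size = faces["front"].shape[0]

    # Unit view direction for every pixel of the equirectangular frame
    lon = (np.arange(width) + 0.5) / width * 2 * np.pi - np.pi    # -pi..pi
    lat = np.pi / 2 - (np.arange(height) + 0.5) / height * np.pi  # +pi/2..-pi/2
    lon, lat = np.meshgrid(lon, lat)
    d = np.stack([np.cos(lat) * np.sin(lon), np.sin(lat), np.cos(lat) * np.cos(lon)], -1)

    out = np.zeros((height, width, 3), dtype=faces["front"].dtype)
    dominant = np.abs(d).max(axis=-1)
    for name, (f, r, u) in FACES.items():
        df = d @ np.array(f, float)
        mask = (df > 0) & (df >= dominant - 1e-9)       # pixels looking at this face
        fu = (d[mask] @ np.array(r, float)) / df[mask]  # in-face coords in [-1, 1]
        fv = (d[mask] @ np.array(u, float)) / df[mask]
        col = np.clip(((fu + 1) / 2 * size).astype(int), 0, size - 1)
        row = np.clip(((1 - fv) / 2 * size).astype(int), 0, size - 1)
        out[mask] = faces[name][row, col]
    return out
```

For VR180, you would keep only the centre half of the resulting frame, i.e. the front hemisphere covered by the front, left, right, top, and bottom faces.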