xR for VR Workflow
The purpose of this workflow is to enable the production of Spatial Video (VR180 or VR360) content, for consumption on Spatial Computing devices, from a typical LED xR production stage using standard broadcast cameras and lenses, through a combination of spherical rendering and content-reprojection techniques.
The workflow is a combination of several features found in the Disguise Designer toolset.
Essentially, we take a cubemap rendered from a 360 scene and reproject the stage into the correct position within that scene, relative to the zero origin point of the world.
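The idea behind using six cameras can be sketched in a few lines: every view direction from the world origin lands on exactly one face of a cubemap, so six square renders are enough to cover the full sphere. The axis convention below (+X right, +Y up, +Z forward) is an assumption for illustration, not Designer's internal convention.

```python
# Illustrative sketch: which cube face does a view direction from the
# origin hit? Face names mirror the virtual-camera names used later.
def cubemap_face(x, y, z):
    """Return the cube face a direction vector (from the origin) hits."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if az >= ax and az >= ay:            # depth axis dominates
        return "front" if z > 0 else "back"
    if ax >= ay:                         # horizontal axis dominates
        return "right" if x > 0 else "left"
    return "top" if y > 0 else "bottom"  # vertical axis dominates

# A direction straight ahead hits the front face:
# cubemap_face(0, 0, 1) -> "front"
```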
1. Stage Setup
- Create a standard xR LED stage in the Disguise software. Add an LED wall and floor to the stage.
- Add a broadcast camera. In this example, it is called d3cam.
- Set up and calibrate the stage, ensuring lens, colour, and delays are fully calibrated.
- Position your broadcast camera so that it covers all participants on the stage from head to toe. You can preview this in the stage visualiser and use the mouse to pan, rotate, and zoom the stage into view.
The xR Stage with Test Patterns
2. Add and Configure Virtual Cameras
In Designer, add 6x virtual cameras. The table below includes suggestions for naming and rotation settings of each virtual camera in the camera array.
Virtual camera | Rotation x,y,z |
---|---|
vcam1-front | 0,0,0 |
vcam2-left | 0,-90,0 |
vcam3-right | 0,90,0 |
vcam4-back | 0,180,0 |
vcam5-top | 90,0,0 |
vcam6-bottom | -90,0,0 |
- Set each virtual camera as a child of the broadcast camera, d3cam.
d3cam Hierarchy - all virtual cameras are children of d3cam
- Ensure the virtual cameras all have the following settings:
- Parent camera: d3cam
- Coordinate system: Global
- Offset: 0,0,0
- Physical > Lens > Lens source: Local intrinsics
- Set all virtual camera resolutions to a square resolution / aspect ratio (e.g. 1920 x 1920 px). The overscan resolution will automatically update to the same resolution.
- Orient four of the virtual cameras to the compass points by rotating them around the Y axis, and point the remaining two up and down.
Reference cubemap highlighting virtual camera positions
Camera List Editor
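As a sanity check on the rotation table above, the six rotations point the cameras along the six axis directions. The sketch below assumes +Z forward, +Y up, and a pitch-then-yaw rotation order purely for illustration; Designer's actual Euler convention may differ.

```python
import math

def forward_vector(rx_deg, ry_deg):
    """Forward direction after pitching rx degrees about X, then yawing
    ry degrees about Y. Starts facing +Z with +Y up (assumed convention)."""
    rx, ry = math.radians(rx_deg), math.radians(ry_deg)
    fx = math.cos(rx) * math.sin(ry)
    fy = math.sin(rx)
    fz = math.cos(rx) * math.cos(ry)
    return (round(fx, 6), round(fy, 6), round(fz, 6))

# Rotations from the table; each camera faces a different axis direction.
cams = {
    "vcam1-front":  (0, 0),      # -> (0, 0, 1)
    "vcam2-left":   (0, -90),    # -> (-1, 0, 0)
    "vcam3-right":  (0, 90),     # -> (1, 0, 0)
    "vcam4-back":   (0, 180),    # -> (0, 0, -1)
    "vcam5-top":    (90, 0),     # -> (0, 1, 0)
    "vcam6-bottom": (-90, 0),    # -> (0, -1, 0)
}
for name, (rx, ry) in cams.items():
    print(name, forward_vector(rx, ry))
```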
3. Add 6 MR Sets
- Create 6x MR Sets and number them sequentially.
- Assign each of the virtual cameras to the Camera override property in its MR Set. The table below lists the virtual camera names, rotations, and Camera override assignments.
Virtual camera | Rotation x,y,z | mrset |
---|---|---|
vcam1-front | 0,0,0 | mrset1 |
vcam2-left | 0,-90,0 | mrset2 |
vcam3-right | 0,90,0 | mrset3 |
vcam4-back | 0,180,0 | mrset4 |
vcam5-top | 90,0,0 | mrset5 |
vcam6-bottom | -90,0,0 | mrset6 |
- On mrset1, which is front-facing, add the wall and floor LEDs to On-Stage.
- Set the Output Resolution of all MR Sets to 1920 x 1920.
4. Add a Spherical Mesh as a Backplate
- In the Stage Editor, create a virtual projection surface, name it sphere, and select the puffersphere mesh.
- Scale the sphere to 10, 10, 10 and set its resolution to 3840 x 3840.
- Set the virtual projection surface's Render layer to Backplate (MR).
Sphere Scale, Mesh & Render layer: Backplate (MR)
5. Add a StageRender Layer
- Add a StageRender layer to the timeline.
- Create a new mapping named Spherical.
- Set the StageRender layer's mapping to Spherical and its Render layer to Backplate.
6. Set up the Output Feed
- Open the Feed View by left-clicking on Feed in the Dashboard.
- Position each mrset feed so that together they form a cubemap, using the feed output image as a reference. Use ALT and drag to draw an arrow from each MR Set to the Output window.
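One way to reason about the feed arrangement: with 1920 x 1920 faces, a horizontal-cross cubemap layout occupies a 7680 x 5760 canvas. The grid positions below are an illustrative choice, not a Designer requirement; use whatever layout your downstream stitching tool expects.

```python
FACE = 1920  # each MR Set output is square, per the steps above

# Horizontal-cross layout, 4 faces wide by 3 tall, as (column, row).
# This particular arrangement is an assumption for illustration.
grid = {
    "mrset5-top":    (1, 0),
    "mrset2-left":   (0, 1),
    "mrset1-front":  (1, 1),
    "mrset3-right":  (2, 1),
    "mrset4-back":   (3, 1),
    "mrset6-bottom": (1, 2),
}

# Top-left pixel offset of each feed on the output canvas.
offsets_px = {name: (c * FACE, r * FACE) for name, (c, r) in grid.items()}

# Total canvas needed for the cross layout:
canvas = (4 * FACE, 3 * FACE)  # 7680 x 5760
```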
7. Add VR180 or VR360 Content
Using a Video Layer
- If you are using a Video layer, add the layer and select your media.
- Add a Spherical mapping type, set it to 3D, and target the virtual LED screen.
Using RenderStream
- Add a RenderStream Layer to your track.
- In the RenderStream Layer Editor, select your VR180 or VR360 asset, such as an Unreal project.
- On the layer, add a Cluster Pool of six RX III render nodes.
- Add a Spherical mapping type, set it to 3D, and target the virtual LED screen.
- Add a Channel.
- Start the Workload.
When you start RenderStream, you should see the six faces of a cubemap output from d3, ready for translation into VR180 or VR360 content via a third-party tool.
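The third-party translation step is essentially a resampling: each pixel on a cube face corresponds to one longitude/latitude position in an equirectangular (VR360) frame. A minimal sketch of that mapping, assuming +Z forward, +Y up face bases that match the virtual-camera names above (real stitching tools fix their own conventions):

```python
import math

def face_uv_to_equirect(face, s, t):
    """Map a point on a cube face (s, t in [0, 1], t down) to normalised
    equirectangular coordinates (u, v in [0, 1]). Face axes are assumed."""
    a, b = 2 * s - 1, 1 - 2 * t  # face-local coords in [-1, 1], b pointing up
    dirs = {                     # face -> 3D view direction for that pixel
        "front":  ( a,  b,  1), "back":   (-a,  b, -1),
        "right":  ( 1,  b, -a), "left":   (-1,  b,  a),
        "top":    ( a,  1, -b), "bottom": ( a, -1,  b),
    }
    x, y, z = dirs[face]
    lon = math.atan2(x, z)                 # -pi..pi around the vertical axis
    lat = math.atan2(y, math.hypot(x, z))  # -pi/2..pi/2 above/below horizon
    return (lon / (2 * math.pi) + 0.5, 0.5 - lat / math.pi)

# The centre of the front face lands at the centre of the equirect frame:
# face_uv_to_equirect("front", 0.5, 0.5) -> (0.5, 0.5)
```

A stitcher runs this (or its inverse) per output pixel and samples the corresponding face; for VR180, only the forward hemisphere of the result is kept.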