xR Stage Setup
This topic covers setting up your virtual stage and cameras in Designer for xR calibration of an LED stage.
Workflow
- Add LED screens to the stage. For the xR workflow, LiDAR-scanned, UV-unwrapped OBJ meshes work best (a mesh sanity-check sketch follows this list).
- Add virtual cameras to the stage.
- Connect the video output of the physical stage camera to a video input on the Disguise server.
- Check and configure physical camera settings, including white balance, framerate, and genlock status.
- Patch the video input(s) to the virtual cameras in the stage using the Video Input Patch Editor.
- Run the preview in the Video Input Patch Editor to confirm that the camera feed is being received and that the signal format is correct.
- Check and configure the camera tracking system.
- Create a position receiver to link to the camera tracking system.
- Add the tracking driver(s) appropriate to your camera tracking system to the position receiver.
- Engage the camera tracking drivers.
- Assign the driver in Designer to a virtual camera.
- Monitor the incoming data from the camera tracking system to confirm it is being received promptly and at the expected rate (a tracking-packet monitor sketch follows this list).
- Create an MR set.
- Optionally, add a set extension mesh.
- Add LED screens to the MR set.
- Add the camera(s) to the MR set if required.
- If using multiple cameras, configure and test the indirection controller.
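
Before importing a scanned LED screen mesh, it can help to confirm that the OBJ file actually contains UV coordinates and is at a sensible scale. The Python sketch below is a rough, Designer-independent check under those assumptions; the file name is a placeholder, and it only inspects a plain OBJ export.

```python
# Quick sanity check for an LED screen OBJ mesh before importing it into Designer.
# A rough sketch: it only confirms the file contains UV coordinates ('vt' records)
# and reports the bounding box so you can verify the mesh is at real-world scale.
from pathlib import Path

def inspect_obj(path: str) -> None:
    vertices, uv_count = [], 0
    for line in Path(path).read_text().splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":                      # geometric vertex
            vertices.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == "vt":                   # texture (UV) coordinate
            uv_count += 1
    if not vertices:
        raise ValueError("no vertices found - is this a valid OBJ file?")
    mins = [min(v[i] for v in vertices) for i in range(3)]
    maxs = [max(v[i] for v in vertices) for i in range(3)]
    size = [maxs[i] - mins[i] for i in range(3)]
    print(f"{len(vertices)} vertices, {uv_count} UV coordinates")
    print(f"bounding box (model units): {size[0]:.3f} x {size[1]:.3f} x {size[2]:.3f}")
    if uv_count == 0:
        print("warning: mesh has no UVs, so content cannot be mapped onto it correctly")

inspect_obj("led_screen_scan.obj")   # hypothetical file name
```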
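To verify that camera tracking data is arriving on the network independently of Designer, you can listen for the packets directly. The sketch below assumes the tracking system outputs FreeD (D1) packets over UDP; the port number and scale factors are assumptions and vary by vendor, so check your tracking system's documentation. Systems that use PSN or a proprietary protocol will need a different approach.

```python
# Minimal FreeD (D1) tracking-packet monitor -- a diagnostic sketch, not part of Designer.
# Assumptions: the tracking system sends FreeD over UDP to port 40000 and uses the
# common scale factors (angles * 32768, positions in mm * 64).
import socket
import time

FREED_PORT = 40000          # assumed port; match your tracking system's output
PACKET_SIZE = 29            # FreeD D1 messages are 29 bytes

def signed24(b: bytes) -> int:
    """Decode a big-endian signed 24-bit integer."""
    value = int.from_bytes(b, "big", signed=False)
    return value - 0x1000000 if value & 0x800000 else value

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", FREED_PORT))
last = time.monotonic()

while True:
    data, addr = sock.recvfrom(1024)
    now = time.monotonic()
    if len(data) != PACKET_SIZE or data[0] != 0xD1:
        print(f"unexpected packet ({len(data)} bytes) from {addr}")
        continue
    camera_id = data[1]
    pan  = signed24(data[2:5])   / 32768.0   # degrees (common scaling)
    tilt = signed24(data[5:8])   / 32768.0
    roll = signed24(data[8:11])  / 32768.0
    x    = signed24(data[11:14]) / 64.0      # millimetres (common scaling)
    y    = signed24(data[14:17]) / 64.0
    z    = signed24(data[17:20]) / 64.0
    interval_ms = (now - last) * 1000.0
    last = now
    print(f"cam {camera_id}: pan {pan:+7.2f} tilt {tilt:+7.2f} roll {roll:+7.2f} "
          f"pos ({x:8.1f}, {y:8.1f}, {z:8.1f}) mm  dt {interval_ms:5.1f} ms")
```

The printed interval between packets should roughly match the tracking system's update rate; large or irregular gaps suggest a network or configuration problem to resolve before continuing with calibration.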