BlackTrax Integration Setup
BlackTrax (BT) is a vision-based tracking system available from CAST, capable of tracking both objects and people. Tracking data from BT can be input into Disguise software and mapped to objects or props within the Disguise visualiser.
The BT system uses an array of cameras mounted on the truss or staging rig to observe IR markers (small active LEDs) placed on the talent or prop. These active markers generate rigid-body information that is sent to Disguise. As long as a marker is observable by at least two cameras, an accurate positional fix is possible.
The markers are attached to a small BT Beacon unit (usually attached to the talents belt). This unit provides power and control to the IR markers.
BlackTrax can integrate in two ways:
- Actor/talent tracking
- Prop/screen tracking (used for movable scenic pieces)
A BlackTrax server contains the software programs required for configuring a BT system, including BTWYG, BlackTrax, and Motive.
⚠️ Please be aware that the workflow outlined below may be outdated. This section is under review, and updated information will be added as newer workflows are evaluated and confirmed.
For additional information and training on using BlackTrax, please visit the CAST website.
Disguise Project Setup
Your visualiser must be set up to resemble the real-world environment as closely as possible to avoid any issues that may occur when calibrating. Follow these steps to set up your project in Designer.
1. Create a project and enable tracking
- Create a project.
- In the d3 Projects folder, open the project’s internal folder.
- Open the dlls folder and open dlls.txt.
- Add tracking.dll, then save the text file.
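The dlls.txt edit above is a simple one-line append. A minimal sketch of the idea in Python (the temporary file below stands in for your project's dlls/dlls.txt; the actual path depends on your project location):

```python
# Illustrative sketch only: appends tracking.dll to a dlls.txt file,
# skipping the write if it is already listed. The demo path is a
# temporary stand-in for <project>/dlls/dlls.txt.
import tempfile
from pathlib import Path

def enable_tracking(dlls_txt: Path) -> bool:
    """Append tracking.dll to dlls.txt if it is not already listed."""
    lines = dlls_txt.read_text().splitlines() if dlls_txt.exists() else []
    if "tracking.dll" in (line.strip() for line in lines):
        return False  # already enabled, nothing to do
    lines.append("tracking.dll")
    dlls_txt.write_text("\n".join(lines) + "\n")
    return True

demo = Path(tempfile.mkdtemp()) / "dlls.txt"
demo.write_text("someother.dll\n")
enable_tracking(demo)
print(demo.read_text().splitlines())  # → ['someother.dll', 'tracking.dll']
```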
2. Open the project and add a tracking point
- Open the project, then make the visualiser camera visible by right-clicking Stage > Camera > Visualiser camera > Visible.
- Open the Visibility tab and make tracking points and tracking labels visible.
3. Set the projector properties
Set the projector’s properties, such as resolution, lens, and position. Also set the resolution on the feeds; if these are not set correctly, they can affect the calibration process. The real-world projector must not have any warping, keystone, lens shift, or zoom applied.
4. Add a visual reference
Add a mesh of known dimensions (e.g. a rectangular mesh) at the origin (0,0,0) of the stage within the visualiser. This mesh will act as a visual reference to assess any orientation discrepancies of objects, and will later serve as a datum point for calibration procedures.
Integration
Environment Registration
After setting up and configuring the BlackTrax system, the next step is to register a BlackTrax device within your project. Follow the steps below to integrate and align the device for accurate positional tracking:
1. Create the BlackTrax device: In the dashboard, right-click within the device list and select Add Automation Device. Choose BlackTrax from the available options.
2. Assign the driver: Once the device is created, its editor window will open automatically. Navigate to the Driver tab and assign the BlackTraxDriver. Note: Upon successful creation, provisional BlackTrax tracking data will begin streaming into the system. This is visually represented in the visualiser as green points, corresponding to beacon stringer positions.
3. Prepare the registration area: Begin the environment registration process to align coordinate systems between BlackTrax and Disguise:
   - Mark a 90-degree angle centred on the stage using the same dimensions as the reference mesh created earlier.
   - Position tracking stringers at the centre stage (CS) point and at both endpoints of the right angle (defining the X and Z directions).
   Note: Ensure the physical markings precisely match the dimensions and orientation of the virtual datum mesh.
4. Open the registration tool: Right-click the existing BlackTrax device and open its editor. Go to the Registration tab.
5. Place reference points: The cursor will change to a placement tool, indicating readiness to define registration points. Drop three reference points onto the datum mesh:
   - Centre stage (CS)
   - Positive X direction
   - Positive Z direction
6. Link virtual and real-world points: Using the registration UI, link each of the dropped reference points to its corresponding live BlackTrax beacon position. After successfully registering all three points, proceed to finalisation.
7. Finalise the registration: Select the Finalise field in the registration UI. This step applies the computed transformation, aligning the virtual space with the real-world BlackTrax coordinate system.
Once complete, the GUI will display aligned reference and tracking points, confirming a successful calibration.
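Conceptually, the three registration points define a coordinate frame: CS becomes the origin, and the two other points fix the X and Z axes. The sketch below illustrates the underlying maths only; it is not Disguise's actual implementation:

```python
# Illustrative sketch: build a rotation matrix and origin from the three
# registration points (centre stage, +X direction, +Z direction).
import math

def frame_from_points(cs, px, pz):
    """Return (rotation matrix with X/Y/Z axes as columns, origin)."""
    def sub(a, b):
        return [a[i] - b[i] for i in range(3)]
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return [c / m for c in v]
    def cross(a, b):
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]
    x = norm(sub(px, cs))   # +X axis points from CS toward the X marker
    z = norm(sub(pz, cs))   # provisional +Z axis
    y = cross(z, x)         # +Y (up) completes a right-handed frame
    z = cross(x, y)         # re-orthogonalise Z against X and Y
    rot = [[x[i], y[i], z[i]] for i in range(3)]
    return rot, list(cs)

rot, origin = frame_from_points((0, 0, 0), (2, 0, 0), (0, 0, 3))
print(rot)  # identity rotation for axis-aligned input points
```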
Note: The system receives positional data from BlackTrax at a frequency of 100 Hz, enabling real-time tracking accuracy.
OSC Setup
At this point, all the basic setup of the project is done. The next section demonstrates how to integrate the external OSC hardware, which is the easiest way to control the calibration process in Disguise (note: in older versions of the software, the only way to calibrate the system is with an OSC interface). Follow these steps to set up OSC:
1. Create an OSC device in Disguise by right-clicking on “Devices” in the dashboard. Once open, there will be a plus and minus field; use this to add and remove devices from the project.
2. Select the +. This gives you the option to select from a list of commonly used devices; in this list there will be a device called OSC1. Select this device to add it to the project.
3. Once added, ensure all the fields in the device are filled in correctly (use the table below for help):
| OSC Hardware | Disguise OSC |
| --- | --- |
| Host | d3net port address |
| Port (Outgoing) | Port receive |
| Port (incoming) | Port send |
| Local IP address | OSC hardware IP |

4. Sync the OSC layout to the hardware with the TouchOSC editor (available here). Now if you change something on the OSC controller, you will see the OSC device indicator in the dashboard emit green to show it is receiving a signal.
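Under the hood, OSC messages are small binary UDP datagrams. The sketch below shows the OSC 1.0 wire format for float arguments; the fader address is a made-up example, and in practice TouchOSC and Disguise handle this encoding for you:

```python
# Illustrative sketch of OSC 1.0 message encoding (floats only).
import struct

def encode_osc(address: str, *args: float) -> bytes:
    """Encode an OSC message with float arguments."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to 4-byte boundaries.
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(b"," + b"f" * len(args))  # type tag string, e.g. ",f"
    for a in args:
        msg += struct.pack(">f", a)      # big-endian 32-bit float
    return msg

packet = encode_osc("/1/fader1", 0.5)   # hypothetical fader address
print(len(packet))  # → 20
```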
TouchOSC layout explained
- Start: Starts the calibration process.
- Cancel: Cancels last placed point so it can be redone.
- Done: Finishes calibration process.
- Skip: Used when a projector will not calibrate a point at this time, for instance because it does not have coverage of the stringer’s current position. Typically used when the frustum extremities of individual projectors are being calibrated.
- Trackpad: Used for quick rough placement of the crosshair.
- Directional keys: Used for fine control per click, moving the crosshair on a single axis. The crosshair will move faster the longer the button is held.
- Rotate left & right: Used for aligning the movement direction of the crosshair to the orientation of the user that is calibrating.
- Previous & Next: Used to jump between modes of operation.
Projector Calibration
- Open the BT device. At the bottom of the editor, select the field called “BtProjectorCalibrator”; this opens the BT calibration editor.
- Assign the OSC device that will be used to calibrate. (Note: set this device even if not being used).
- Assign the tracked point; this is the string that will be used for the calibration process.
- Assign the projectors used in the BT system.
- Now open TouchOSC with the given layout (or the GUI interface, which can be found under the tab “Remote Calibrator”).
- Press Start on the OSC app to start the process (a test pattern will flash on the output to show the full output size; this is normally the colour of the projector’s wireframe).
- Place the stringer in the projector’s output (for example, bottom right of the coloured output).
- Press Next on the OSC app; this will launch a line-up grid.
- Move the crosshair in the OSC editor to move the crosshair in the projector output. Line this up with the string to calibrate the space.
- When in the correct position, click Next on the OSC app.
- The software will now show the second projector’s line-up grid and overlap. If the point is in the next projector’s field of view, use the string as a reference point. When lined up, press Next.
- Repeat steps 7 to 11, creating at least 6 reference points per projector.
- Several points will now be seen in the GUI; these points represent the calibration points.
- When you have finished placing points push Done on the OSC app.
Note: The same steps are used for the GUI interface, as it follows the same process.
Note: Calibrate points at different heights and depths to gain the best calibration possible. The more information the system has to process, the better the calibration will be.
The projector values will now be greyed out; the space is calibrated, and video will be able to track objects.
Some points may affect the calibration badly, in the same way a badly calibrated point in QuickCal can ruin a line-up. This can be checked with the following steps:
- Open the editor for one of the projectors used for the BlackTrax process.
- Open the projector calibration.
- View the calibration scores; if a score is poor, try muting points to improve the calibration results.
Rigid bodies
The next step is to assign objects to strings so the object can be tracked. To start this process, follow the steps below:
- Open the BlackTrax device by right-clicking on the device in the device list; this will open the device editor.
- Open the tab “Rigid Body”. This will present you with a plus and minus field; use this to add and remove objects.
- Once you have created a rigid body, an editor will open. This will contain the fields described in the Terminology section below:
- Select the desired object.
- The cursor will become a point to show that it is ready to drop points; drop four reference points on the object. These will be used as static reference points.
- Under the points tab, add as many fields as you would like to track.
- Set your static points and set the tracked points which will be the BlackTrax beacons that are being used.
- Select “calibrate: snap to fixed point” field.
- Now the real-world object should match the orientation and position of the object in the GUI. When the real-world object moves, the visualised object should follow.
- Edit the different thresholds to make the movement smoother.
Terminology
Object: Select the object intended to become a RigidBody.
Engaged: Enable or disable tracking for the RigidBody.
Status: Not Calibrated means the object will not be tracked. This can be manually set to Calibrated to enable tracking of the body onto the object; when done, the selected object will jump to line up its fixed points with the selected tracked points.
Alternatively, the Calibrated state will be the result of pressing the Calibrate button explained under Points.
Iterations: The RigidBody solver is iterative, meaning that over at most n iterations (the maximum number being defined by this value) it will attempt to resolve a position/rotation matrix which best represents where the RigidBody should be in the coordinate system.
Solver mode: RigidBody or Actor.
Set to RigidBody, it will allow tracking of X, Y, Z position and rotation.
Set to Actor, it will allow X, Y, Z position tracking, but no rotation. Both solver modes will technically track from a single visible stringer upwards, though this is not a great idea for rigid bodies in most cases. The RigidBody solver requires multiple stringers to be visible in order to calculate rotation. The Actor solver averages incoming and reference positions and calculates the difference on each update. Updates arrive at 100 Hz, as per the refresh rate of the Motive settings.
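The Actor solver's averaging behaviour can be sketched as follows. This is an illustration of the averaging idea only (positions in arbitrary units), not Disguise's actual solver:

```python
# Illustrative sketch: average reference and incoming stringer positions
# and return the positional offset to apply to the object this update.
def actor_offset(reference_points, incoming_points):
    def centroid(pts):
        n = len(pts)
        return [sum(p[i] for p in pts) / n for i in range(3)]
    ref_c = centroid(reference_points)
    in_c = centroid(incoming_points)
    return [in_c[i] - ref_c[i] for i in range(3)]

# Two stringers, both shifted +1 on X and +0.5 on Y since calibration.
offset = actor_offset([(0, 0, 0), (2, 0, 0)], [(1, 0.5, 0), (3, 0.5, 0)])
print(offset)  # → [1.0, 0.5, 0.0]
```

Because only centroids are compared, no rotation can be recovered, which is why the Actor mode tracks position only.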
Movement Threshold: Keeps Dynamic Blend from recomputing when set to a non-zero value. This is used when static oscillation on almost-static objects becomes an issue.
Rotation Threshold: As above, but for rotation; keeps Dynamic Blend from recomputing when set to a non-zero value.
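A threshold of this kind can be pictured as a simple gate on each update: small movements below the threshold are treated as jitter and ignored. A conceptual sketch with made-up units (Disguise's actual smoothing is internal to the solver):

```python
# Illustrative sketch: suppress static oscillation by ignoring position
# updates that move less than the threshold distance.
import math

def gate(previous, incoming, threshold):
    """Return incoming only if it moved beyond the threshold."""
    return incoming if math.dist(previous, incoming) > threshold else previous

pos = (0.0, 0.0, 0.0)
pos = gate(pos, (0.002, 0.001, 0.0), threshold=0.01)  # jitter: ignored
print(pos)  # → (0.0, 0.0, 0.0)
pos = gate(pos, (0.5, 0.0, 0.0), threshold=0.01)      # real move: accepted
print(pos)  # → (0.5, 0.0, 0.0)
```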
Points: Points are where you select a number of fixed points, placed by clicking on vertices of your rigid body models, and tie them to stringers. Simply select the correct points and set offsets where required.
Tweak: Tweak is used to more accurately align projection onto your rigid body. Use its offsets on XYZ position and rotation until projection lines up. It should be noted that sometimes you may be better off rescaling your model instead, depending on your results.
Constraints: Set these on all axes where you would like to restrict movement. 1 turns on, 0 turns off.
Tracking lost: These settings allow three forms of behaviour in case tracking data is lost:
Do nothing: Does nothing.
Reset: Allows you to define a position that rigid bodies will jump to in case tracking data is lost.
Follow sequencing: The object will follow any ScreenPositionModule programmed on the timeline, as a form of redundancy.