BlackTrax Overview
BlackTrax is a vision-based tracking system that connects to third-party applications such as robotic lights and media servers, and to any other system that accepts the open RTTrPM protocol.
BlackTrax Overview
BlackTrax (BT) is a vision-based tracking system from CAST, capable of tracking both objects and people. Tracking data from BT can be fed into Disguise and mapped to objects or props within the visualiser.
The BT system uses an array of cameras mounted on the truss/staging to observe IR markers (small LEDs) placed on the talent or prop. These active markers generate rigid-body information that is sent to Disguise. As long as a marker is visible to at least two cameras, an accurate positional fix is possible.
The markers are attached to a small BT Beacon unit (usually worn on the talent's belt), which provides power and control to the IR markers.
BlackTrax can integrate with Disguise in two ways:
- Actor/talent tracking
- Prop/screen tracking (used for movable scenic pieces)
A BlackTrax server contains the software required to configure a BT system, including BTwyg, BlackTrax, and Motive.
For additional information and training on using BlackTrax, please visit the CAST website.
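BT streams tracking data to Disguise over the RTTrPM protocol, typically carried on UDP. As a rough sketch of what a receiver does, the snippet below unpacks a position from a packet payload. The payload layout and port used here are purely illustrative assumptions; consult the RTTrPM specification from CAST for the real packet structure, which includes headers and per-module data.

```python
import socket
import struct

# Hypothetical payload layout for illustration only: three big-endian
# 32-bit floats (x, y, z). The real RTTrPM packet carries a header and
# one or more data modules; see the CAST RTTrPM specification.
def parse_position(payload: bytes) -> tuple:
    x, y, z = struct.unpack(">fff", payload[:12])
    return (x, y, z)

def listen(port: int) -> None:
    # The port number must match your configured RTTrPM output port.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        print(parse_position(data))
```

In practice you would hand the decoded positions to whatever maps them onto your tracked objects, rather than printing them.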
Disguise Project Setup
It is important that your visualiser is set up to resemble the real-world environment as closely as possible, to avoid issues when calibrating. Follow these steps to set up your project within Disguise:
- Create a project. Once created, you will need to enable tracking; to start this process, open the project's internal folder.
- Open "dlls"; inside you will see a notepad document. Open it, type the following into the document, then save:
tracking.dll
- The project can now be opened. Once launched, open the visualiser camera (see path below), then open the visibility tab and make "tracking point" and "tracking labels" visible.
Stage (right click) > Camera > Visualiser camera > Visible
- Set the projector's properties, such as resolution, lens, and position, and set the resolution on the feeds; if these are not set correctly they can affect the calibration process.
Note: The real-world projector must not have any warping, keystone, lens shift or zoom.
- Finally, you will need a mesh of a known size placed on the zero point of the stage within the visualiser. This is normally a rectangular mesh, chosen so that changes in the object's orientation are easy to see. It will later be used as a datum for calibration.
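The "dlls" step above can also be scripted when setting up many projects. The sketch below appends `tracking.dll` to that document idempotently; the file location is an assumption here, so point it at the file inside your actual project's internal folder.

```python
from pathlib import Path

# Sketch of the "dlls" step: append "tracking.dll" to the project's dlls
# document if it is not already listed. The folder layout is an assumption;
# locate the real file inside your project's internal folder.
def enable_tracking(project_folder: str) -> Path:
    dlls = Path(project_folder) / "dlls"
    lines = dlls.read_text().splitlines() if dlls.exists() else []
    if "tracking.dll" not in lines:
        lines.append("tracking.dll")
        dlls.write_text("\n".join(lines) + "\n")
    return dlls
```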
Integration
Environment Registration
Once the BlackTrax system has been properly set up and configured, the next step is to add a BlackTrax device into your project. Follow these steps to configure the BT device within Disguise:
- Create a BlackTrax device by right-clicking Devices in the dashboard and adding an automation device.
- Once created, this will open an editor for the device. Open the driver tab and select "BlackTraxDriver".
Note: Once created, you will see provisional BlackTrax data coming into the system, indicated by green points in the visualiser. These points represent beacon stringer positions.
- Next, start the registration process, which sets the zero point and orientation so that BlackTrax and Disguise have matched position data. First mark a 90-degree angle at the centre of the stage, the same size as the mesh created earlier, as this will be used as a datum. Place a stringer on the centre-stage point and on the end points of the right angle.
Note: Ensure the marking in the real world matches the datum mesh created earlier.
- Right-click the BlackTrax driver created earlier to open its editor, then open the registration tab.
- The cursor will change to a point, showing that the software is ready to drop points. Drop three points on the datum created earlier (centre stage, X, and Z).
- Use the registration UI to link the reference points and BlackTrax points together. Once all three points are registered, move on to the next step.
- Select the field called "Finalise". This takes the results of the registration and uses them to orient the space; in the GUI this is normally seen as the reference and BlackTrax points matching.
Note: Disguise receives positional information from the BT system at a rate of 100Hz.
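The idea behind the three-point registration above can be sketched as follows: the centre-stage point and the points along X and Z define a frame, and incoming BlackTrax positions are re-expressed in that frame. This is an illustrative simplification, not Disguise's actual solver, and the axis handedness chosen here is an assumption.

```python
import math

# Helper vector operations on 3-element lists.
def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def norm(a):
    m = math.sqrt(dot(a, a))
    return [c / m for c in a]

# Build a mapping from BlackTrax space to stage space from the three
# datum points: centre stage (origin), a point along X, a point along Z.
def make_registration(origin, x_point, z_point):
    ex = norm(sub(x_point, origin))   # stage X axis
    ez = norm(sub(z_point, origin))   # approximate stage Z axis
    ey = norm(cross(ez, ex))          # stage up axis; handedness assumed
    ez = cross(ex, ey)                # re-orthogonalise Z
    def to_stage(p):
        d = sub(p, origin)
        return [dot(d, ex), dot(d, ey), dot(d, ez)]
    return to_stage
```

For example, with the datum at `[1, 0, 2]` and axis points one unit along X and Z, the point `[2, 0, 3]` maps to stage coordinates `[1, 0, 1]`.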
OSC Setup
At this point the basic project setup is done. The next section demonstrates how to integrate the external OSC hardware. OSC is the easiest way to control the calibration process in Disguise (in older versions of the software it is the only way to calibrate the system). Follow these steps to set up OSC:
- Create an OSC device in Disguise by right-clicking "device" in the dashboard. Once open, there will be a plus and minus field; use this to add and remove devices from the project.
- Select the +; this gives you the option to select from a list of commonly used devices, which includes a device called OSC1. Select this device to add it to the project.
- Once added, ensure all the fields in the device are filled in correctly (use the table below for help):

| OSC Hardware | Disguise OSC |
| --- | --- |
| Host | d3net port address |
| Port (Outgoing) | Port receive |
| Port (incoming) | Port send |
| Local IP address | OSC hardware IP |

- Sync the OSC layout to the hardware with the TouchOSC editor (available here). Now, if you change something on the OSC controller, the OSC device indicator in the dashboard will emit green to show it is receiving a signal.
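Under the hood, the controller and Disguise exchange OSC messages: a null-padded address pattern, a type-tag string, then big-endian arguments. The minimal encoder below illustrates that wire format; for production work a maintained library such as python-osc is preferable to rolling your own.

```python
import struct

# Minimal OSC message encoder, for illustration of the wire format only.
def osc_message(address: str, *args) -> bytes:
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        else:
            raise TypeError("unsupported OSC argument type")
    return pad(address.encode()) + pad(tags.encode()) + payload
```

For instance, `osc_message("/start")` produces the 12-byte message `b"/start\x00\x00,\x00\x00\x00"`, which could be sent to Disguise's receive port over UDP.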
TouchOSC layout explained
**Start:** Starts the calibration process.
**Cancel:** Cancels the last placed point so it can be redone.
**Done:** Finishes the calibration process.
**Skip:** Used when a projector will not be calibrating a point at this time, for instance because it does not have coverage of the current position of the stringer. Used when the frustum extremities of individual projectors are being calibrated.
**Trackpad:** Used for quick, rough placement of the crosshair.
**Directional keys:** Used for fine, per-click control, or movement of the crosshair on a single axis. The crosshair moves faster the longer the button is held.
**Rotate left & right:** Used for aligning the movement direction of the crosshair to the orientation of the user who is calibrating.
**Previous & Next:** Used to jump between modes of operation.
Projector Calibration
- Open the BT device and, at the bottom of the editor, select the field called "BtProjectorCalibrator"; this opens the BT calibration editor.
- Assign the OSC device that will be used to calibrate. (Note: set this device even if it is not being used.)
- Assign the tracked point; this is the stringer that will be used for the calibration process.
- Assign the projectors used in the BT system.
- Open TouchOSC with the given layout (or use the GUI interface, which can be found under the "Remote Calibrator" tab).
- Press Start on the OSC app to start the process. (A test pattern will flash on the output to show the full output size; this is normally the colour of the projector's wireframe.)
- Place the stringer in the projector's output (for example, the bottom right of the coloured output).
- Press Next on the OSC app; this will launch a line-up grid.
- Move the crosshair in the OSC editor to move the crosshair in the projector output. Line it up with the stringer to calibrate the space.
- When it is in the correct position, press Next on the OSC app.
- The software will now show the second projector's line-up grid and overlap. If the point is in the next projector's field of view, use the stringer as a reference point. When lined up, press Next.
- Repeat steps 7 to 11, creating at least 6 reference points per projector.
- Several points will now be visible in the GUI; these represent the calibration points.
- When you have finished placing points, press Done on the OSC app.
Note: The same steps are used for the GUI interface, as it follows the same process.
Note: Calibrate points at different heights and depths to gain the best calibration possible. The more information the system has to process, the better the calibration will be.
The projector values will now be greyed out; the space is calibrated, and video will be able to track objects.
Some points may negatively affect the calibration, in the same way a badly calibrated point in QuickCal can ruin a line-up. This can be checked with the following steps:
- Open the editor for one of the projectors used in the BlackTrax process.
- Open the projector calibration.
- View the calibration scores. If a score is poor, try muting points to improve the calibration results.
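The score-and-mute workflow can be illustrated with a small helper (not a Disguise API, purely an assumption for illustration): given per-point reprojection errors in pixels for one projector, report an RMS score and flag points whose error is well above the rest as candidates for muting.

```python
import math

# Illustrative only: score a projector's calibration from per-point
# reprojection errors (in pixels) and suggest outliers to mute.
def calibration_report(errors, outlier_factor=2.0):
    rms = math.sqrt(sum(e * e for e in errors) / len(errors))
    mean = sum(errors) / len(errors)
    # Flag points whose error is far above the average of all points.
    mute = [i for i, e in enumerate(errors) if e > outlier_factor * mean]
    return rms, mute
```

A single bad point inflates the RMS disproportionately, which is why muting one outlier often improves the overall line-up more than adding several new points.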
Rigid bodies
The next step is to assign objects to stringers so the objects can be tracked. To start this process, follow the steps below:
- Open the BlackTrax device by right-clicking it in the device list; this will open the device editor.
- Open the "Rigid Body" tab. This presents you with a plus and minus field; use this to add and remove objects.
- Once you have created a rigid body, an editor will open containing the fields described under Terminology below.
- Select the desired object.
- The cursor will become a point to show that it is ready to drop points. Drop four reference points on the object; these will be used as static reference points.
- Under the points tab, add as many fields as you would like to track.
- Set your static points, and set the tracked points, which will be the BlackTrax beacons being used.
- Select the "calibrate: snap to fixed point" field.
- The real-world object should now match the orientation and position of the object in the GUI. When the real-world object moves, the visualised object should follow.
- Edit the different thresholds to make the movement smoother.
Terminology
**Object:** Select the object intended to become a RigidBody.
**Engaged:** Enable or disable tracking for the RigidBody.
**Status:** Not Calibrated means the object will not be tracked. This can be manually set to Calibrated to enable tracking of the body onto the object; when done, the selected object will jump to line up its fixed points with the selected tracked points. The Calibrated option can also result from pressing the Calibrate button explained under Points.
**Iterations:** The RigidBody solver is iterative: for up to n iterations (the maximum number being defined by this value) it will attempt to resolve a position/rotation matrix that best represents where the RigidBody should be in the coordinate system.
**Solver mode:** RigidBody or Actor.
Set to RigidBody, it allows tracking of X, Y, Z position and rotation.
Set to Actor, it allows X, Y, Z position tracking, but no rotation. Both solver modes will technically track from a single visible stringer upwards, though for rigid bodies this is not a good idea in most cases; the RigidBody solver requires multiple stringers to be visible in order to calculate rotation. The Actor solver averages the incoming and reference positions and calculates the difference on each update. Disguise updates at 100Hz, as per the refresh rate of the Motive settings.
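The Actor-solver behaviour described above can be sketched in a few lines: average the visible stringer positions, average the corresponding reference points, and apply the difference as a pure translation with no rotation. Names here are illustrative, not Disguise internals.

```python
# Sketch of the Actor solver: average incoming (visible) stringer
# positions against their reference points and return the translation
# to apply on this update. No rotation is computed in Actor mode.
def actor_offset(visible, references):
    n = len(visible)
    avg_in = [sum(p[i] for p in visible) / n for i in range(3)]
    avg_ref = [sum(p[i] for p in references) / n for i in range(3)]
    return [avg_in[i] - avg_ref[i] for i in range(3)]
```

Averaging over all visible stringers is what lets Actor mode keep working when individual markers are occluded, at the cost of losing orientation.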
**Movement Threshold:** Keeps Dynamic Blend from recomputing when set to a non-zero value. This is used when static oscillation on almost-static objects becomes an issue.
**Rotation Threshold:** Keeps Dynamic Blend from recomputing when set to a non-zero value. This is used when static oscillation on almost-static objects becomes an issue.
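The threshold behaviour amounts to a simple dead-band filter: updates whose displacement from the last accepted position falls below the threshold are ignored, which suppresses jitter on near-static objects. The class below is an illustrative sketch, not Disguise code.

```python
import math

# Dead-band filter sketch: only accept a new position when it has moved
# at least `movement_threshold` from the last accepted position.
class ThresholdFilter:
    def __init__(self, movement_threshold):
        self.threshold = movement_threshold
        self.last = None

    def update(self, pos):
        if self.last is None:
            self.last = pos
            return pos
        if math.dist(pos, self.last) >= self.threshold:
            self.last = pos
        return self.last
```

The same idea applies to the rotation threshold, with an angular distance in place of the Euclidean one.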
**Points:** Points are where you select a number of fixed points, placed by clicking on vertices of your rigid-body models, and tie them to stringers. Simply select the correct points and set offsets where required.
**Tweak:** Tweak is used to align projection onto your rigid body more accurately. Use its offsets on XYZ position and rotation until the projection lines up. Note that, depending on your results, you may sometimes be better off rescaling your model instead.
**Constraints:** Set these on all axes where you would like to restrict movement. 1 turns a constraint on, 0 turns it off.
**Tracking lost:** These settings allow three forms of behaviour in case tracking data is lost:
Do nothing: does nothing.
Reset: allows you to define a position that rigid bodies will jump to if tracking data is lost.
Follow sequencing: Disguise will follow any ScreenPositionModule programmed on the timeline as a form of redundancy.