Tracking Hub Administrator Guide
Version 1.6 | Published December 15, 2022
Motion Analysis Integration
Motion Analysis (MA) is handled like any other tracking system in Viz Virtual Studio. There is no hard limit on the number of objects that can be tracked by Tracking Hub. This section provides information specific to MA systems.
The CORTEX System
The setup and calibration of the MA Raptor camera system is done by trained Motion Analysis Corporation engineers. The software that controls the cameras and the tracking is called Cortex. Cortex was initially developed for motion capture rather than camera tracking (camera tracking functionality was added later), so most of the camera settings are not immediately obvious. This is especially true for camera offset fine adjustment and CCD calibration.
CCD Calibration
Cortex calibrates the relative position of the CCD to the target base using an image-based method. The camera observes a target, which is moved to several positions. From this data, the software calculates the position and the field of view of the camera. Like all image-based methods, this calculation is not 100% accurate, and there is always a need to fine-tune the pan, tilt, and roll angles. The fine-tuning of position and angles is always done in the Cortex system and never in the offset page of the tracking driver.
Sync and FPS (Frames per Second)
Every part of a Virtual Studio must be in sync. This applies to Cortex and the Raptor cameras as well. Verify that the MA system is connected to the same sync signal as the Viz Virtual Studio, and that the signal timings are not drifting independently of each other.
Timing
Depending on the timing family of the sync signal, the Cortex system must be configured as follows:
- 50 Hz formats: Cortex runs at 150 FPS with three-frame reduction.
- 59.94 Hz formats: Cortex runs at 120 FPS with two-frame reduction.
These settings guarantee the lowest latency of the Cortex system.
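The relationship between the sync family, the Cortex capture rate, and the frame-reduction factor can be sketched as follows. This is a minimal Python illustration; `cortex_settings` is a hypothetical helper for clarity, not part of Cortex or Tracking Hub:

```python
def cortex_settings(sync_hz: float) -> tuple[float, int]:
    """Return (capture_fps, frame_reduction) for a given sync family.

    Cortex captures at an integer multiple of the video rate and reduces
    by that factor, so exactly one tracking sample is delivered per
    video frame.
    """
    if abs(sync_hz - 50.0) < 0.05:
        return 150.0, 3            # 150 FPS / 3 = 50 Hz output
    if abs(sync_hz - 59.94) < 0.05:
        return 120.0, 2            # nominally 120 FPS; genlocked, the
                                   # effective rate is 119.88 (59.94 x 2)
    raise ValueError(f"unsupported sync rate: {sync_hz} Hz")

fps, reduction = cortex_settings(50.0)
assert fps / reduction == 50.0     # one tracking sample per video frame
```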
Network Connection
The Cortex system communicates with Tracking Hub over a network. This communication is time critical: every delayed packet is worthless to the virtual set and results in jitter.
IMPORTANT! A separate tracking network between the computer running Tracking Hub and the Cortex machine is mandatory.
A managed switch may be used when it is possible to define a separate subnet for the tracking connection. If such a switch is not available, a direct connection or a basic unmanaged switch is recommended.
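To get a feel for what a "delayed packet" means in practice, the following sketch flags tracking packets that arrive significantly later than the expected frame interval on a 50 Hz stream. The function name and the tolerance value are illustrative assumptions, not a Tracking Hub feature:

```python
FRAME_INTERVAL = 1.0 / 50.0      # 20 ms between packets at 50 Hz
TOLERANCE = 0.25                 # flag packets more than 25% late

def late_packets(arrival_times: list[float]) -> list[int]:
    """Return indices of packets whose inter-arrival gap exceeds tolerance.

    Each late packet corresponds to one frame where the tracking data
    misses its slot, which shows up as jitter in the virtual set.
    """
    late = []
    for i in range(1, len(arrival_times)):
        gap = arrival_times[i] - arrival_times[i - 1]
        if gap > FRAME_INTERVAL * (1.0 + TOLERANCE):
            late.append(i)
    return late

# Example: the third packet arrives 10 ms late.
times = [0.000, 0.020, 0.050, 0.070]
assert late_packets(times) == [2]
```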
Cortex Precision Limit
All optical tracking systems show jitter in position and angle. Jitter in the rotation angles is much more visible than position jitter. The Cortex precision limit for angles is below 1/100th of a degree. Even though this is quite accurate, the jitter is still visible when fully zoomed in on a telephoto lens. Extreme close-up shots are not recommended with any optical tracking system. One possibility is to use a mechanical tracking head for those shots.
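A rough pinhole-model calculation shows why angular jitter that is invisible on a wide shot becomes obvious at full telephoto zoom. The function and the field-of-view values below are illustrative assumptions:

```python
import math

def jitter_pixels(jitter_deg: float, hfov_deg: float, image_width_px: int) -> float:
    """Approximate horizontal pixel displacement caused by angular jitter.

    Uses the pinhole camera model: focal length in pixels follows from
    the horizontal field of view, and the small jitter angle maps to a
    pixel offset via the small-angle approximation.
    """
    focal_px = (image_width_px / 2) / math.tan(math.radians(hfov_deg) / 2)
    return focal_px * math.radians(jitter_deg)

wide = jitter_pixels(0.01, 60.0, 1920)   # wide shot: well under one pixel
tele = jitter_pixels(0.01, 2.0, 1920)    # full telephoto zoom: several pixels
assert tele > 10 * wide                  # jitter far more visible when zoomed in
```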
Zoom and Focus Encoder
The Motion Analysis system uses a zoom and focus encoder to allow Viz to calculate the actual field of view of the lens. Choosing the right encoder is important for an accurate result.
The following external encoders are recommended when no internal digital encoders on the lens are available:
- MoSys
- Shotoku
- EncodaCam
- Internal digital lens encoders
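Conceptually, the encoder reading is mapped to a field of view through a calibration table measured for the specific lens. The sketch below shows this with simple linear interpolation; the table values and function name are assumptions for illustration, not the actual Viz lens-file format:

```python
# (encoder value, horizontal FoV in degrees) - illustrative calibration points
ZOOM_CALIBRATION = [
    (0,    65.0),    # fully wide
    (2048, 30.0),
    (3072, 10.0),
    (4095,  1.8),    # fully zoomed in
]

def fov_from_encoder(raw: int) -> float:
    """Linearly interpolate the field of view for a raw zoom encoder value."""
    points = ZOOM_CALIBRATION
    if raw <= points[0][0]:
        return points[0][1]
    if raw >= points[-1][0]:
        return points[-1][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= raw <= x1:
            t = (raw - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise AssertionError("unreachable: table covers the full encoder range")

assert fov_from_encoder(0) == 65.0
assert fov_from_encoder(4095) == 1.8
```

In practice, real lenses are non-linear in zoom, so a denser calibration table (or spline fit) is needed for accurate results near the telephoto end.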
Motion Capturing
Please see Topology Panel.