OpenTrackIO Documentation v1.0.1

Overview

OpenTrackIO is a free and open-source protocol designed by the SMPTE RiS-OSVP group that seeks to improve interoperability in Virtual Production and beyond.

Virtual Production (VP) encompasses a range of techniques that use camera and lens tracking systems to generate real-time visual effects (VFX) in a render engine, including Augmented Reality (AR) compositing and In-Camera Visual Effects (ICVFX) on LED volumes.

In these Virtual Production workflows the camera tracking system sends the pose of the camera, the lens model and other metadata to a render engine every frame.

In Augmented Reality (AR) setups, this enables the render engine to generate virtual objects from the correct camera position and with the correct lens distortion to match the real-world camera image. In the In-Camera Visual Effects (ICVFX) example, the tracking data is used to render the correct perspective on the LED wall to create the illusion of depth and a sense of parallax.

In Virtual Production it is critical that the camera capture, the tracking data, and the lens data are synchronized in space and time to accurately reproduce the visual effect. A sample of the OpenTrackIO protocol contains all the required data in the appropriate formats to achieve this.

The OpenTrackIO protocol

This documentation is designed for those producing and consuming tracking data. Components that generate and transmit tracking data are referred to as Producers. Components that receive and act upon tracking data are referred to as Consumers. Multiple Producers and Consumers may coexist on the same network at the same time, and a Producer can send multiple concurrent streams of data. There may also be multiple Consumers of a single Producer's data. In the AR example above, the camera tracking system is the Producer and the render engine is the Consumer.

OpenTrackIO defines the schema of JSON samples that contain a wide range of metadata about the device, its transform(s), associated camera and lens. The full schema is given below and can be downloaded here.

All the fields described should be considered optional by the Consumer (although for high-quality tracking in Virtual Production, see the recommended set of fields in the samples below).

OpenTrackIO employs a right-handed coordinate system where the Z-axis points upwards and positive rotations are clockwise around the axis. Y points in the forward camera direction (when pan, tilt and roll are zero). For example, in an LED volume Y would point towards the centre of the LED wall and X would point towards camera-right.
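
The rotation convention can be made concrete with a small example. The sketch below is illustrative rather than normative: it converts an OpenTrackIO rotation given as intrinsic ZXY Euler angles (pan about Z, tilt about X, roll about Y, in degrees) into a 3x3 rotation matrix using standard right-hand-rule matrices; verify the sign convention against the paragraph above before relying on it.

# Illustrative sketch: intrinsic ZXY Euler angles (pan, tilt, roll) to a 3x3
# rotation matrix, assuming standard right-hand-rule rotation matrices.
import math

def _matmul(a, b):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotation_matrix(pan_deg, tilt_deg, roll_deg):
    p, t, r = (math.radians(v) for v in (pan_deg, tilt_deg, roll_deg))
    rz = [[math.cos(p), -math.sin(p), 0],
          [math.sin(p),  math.cos(p), 0],
          [0, 0, 1]]                          # pan about Z
    rx = [[1, 0, 0],
          [0, math.cos(t), -math.sin(t)],
          [0, math.sin(t),  math.cos(t)]]     # tilt about X
    ry = [[ math.cos(r), 0, math.sin(r)],
          [0, 1, 0],
          [-math.sin(r), 0, math.cos(r)]]     # roll about Y
    # Intrinsic Z-X-Y composition: apply pan, then tilt, then roll.
    return _matmul(_matmul(rz, rx), ry)

print(rotation_matrix(180.0, 90.0, 45.0))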

OpenTrackIO employs the OpenLensIO mathematical lens model for the practical application of spherical lens distortion in Virtual Production. See here for conversion mathematics from the OpenCV model.

Software resources

OpenTrackIO's parameters are defined by CamDKit. This repository includes examples for generating and parsing data in Python and C++.

A C++ reference implementation of OpenTrackIO is available on Mo-Sys' GitHub, and a C++ port of the Python parser that demonstrates linkage is provided in CamDKit.

A tool to simulate the mathematical aspects of the OpenTrackIO specification is available in this repository.

It is recommended that metadata samples are transmitted every frame (i.e. coinciding with the video frames from the camera). Each sample provides a snapshot of the status of the tracking system at that instant, as in the example below:

{
  "tracker": {
    "notes": "Example generated sample.",
    "recording": false,
    "slate": "A101_A_4",
    "status": "Optical Good"
  },
  "timing": {
    "mode": "external",
    "sampleRate": {
      "num": 24,
      "denom": 1
    },
    "timecode": {
      "hours": 1,
      "minutes": 2,
      "seconds": 3,
      "frames": 4,
      "frameRate": {
        "num": 24,
        "denom": 1
      }
    }
  },
  "lens": {
    "distortion": [
      {
        "radial": [
          1.0,
          2.0,
          3.0
        ],
        "tangential": [
          1.0,
          2.0
        ],
        "overscan": 3.1
      }
    ],
    "encoders": {
      "focus": 0.1,
      "iris": 0.2,
      "zoom": 0.3
    },
    "entrancePupilOffset": 0.123,
    "fStop": 4.0,
    "pinholeFocalLength": 24.305,
    "focusDistance": 10.0,
    "projectionOffset": {
      "x": 0.1,
      "y": 0.2
    }
  },
  "protocol": {
    "name": "OpenTrackIO",
    "version": [
      1,
      0,
      1
    ]
  },
  "sampleId": "urn:uuid:e925a0d4-206d-4d57-b418-08b076eac650",
  "sourceId": "urn:uuid:81eb1845-4bdc-4825-8333-e911640ebfdd",
  "sourceNumber": 1,
  "transforms": [
    {
      "translation": {
        "x": 1.0,
        "y": 2.0,
        "z": 3.0
      },
      "rotation": {
        "pan": 180.0,
        "tilt": 90.0,
        "roll": 45.0
      },
      "id": "Camera"
    }
  ]
}
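
Because every field is optional, a Consumer should read samples defensively rather than assuming a fixed structure. A minimal, non-normative Python sketch of parsing the sample above:

# Non-normative sketch of a Consumer reading an OpenTrackIO sample. Since every
# field is optional, values are fetched defensively with .get() and fallbacks.
import json

def parse_sample(payload: bytes) -> dict:
    sample = json.loads(payload)
    lens = sample.get("lens", {})
    timing = sample.get("timing", {})
    transforms = sample.get("transforms", [])
    # The camera pose is the composition of all transforms; here we simply take
    # the last one (commonly labelled "Camera") for illustration.
    camera = transforms[-1] if transforms else {}
    return {
        "protocol": sample.get("protocol", {}).get("name"),
        "timecode": timing.get("timecode"),
        "focus_distance_m": lens.get("focusDistance"),
        "translation": camera.get("translation"),
        "rotation": camera.get("rotation"),
    }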


Providing additional static data

It is recommended that a static metadata object is added to a sample approximately every 2 seconds. This additional metadata describes the context of the samples in the source, with data that may change, for example, every take, but will not change every frame.

{
  "static": {
    "camera": {
      "activeSensorPhysicalDimensions": {
        "height": 24.0,
        "width": 36.0
      },
      "label": "A"
    },
    "lens": {
      "make": "LensMaker",
      "model": "Model15"
    }
  },
  "tracker": {
    "notes": "Example generated sample.",
    "recording": false,
    "slate": "A101_A_4",
    "status": "Optical Good"
  },
  "timing": {
    "mode": "external",
    "sampleRate": {
      "num": 24,
      "denom": 1
    },
    "timecode": {
      "hours": 1,
      "minutes": 2,
      "seconds": 3,
      "frames": 4,
      "frameRate": {
        "num": 24,
        "denom": 1
      }
    }
  },
  "lens": {
    "distortion": [
      {
        "radial": [
          1.0,
          2.0,
          3.0
        ],
        "tangential": [
          1.0,
          2.0
        ],
        "overscan": 3.1
      }
    ],
    "encoders": {
      "focus": 0.1,
      "iris": 0.2,
      "zoom": 0.3
    },
    "entrancePupilOffset": 0.123,
    "fStop": 4.0,
    "pinholeFocalLength": 24.305,
    "focusDistance": 10.0,
    "projectionOffset": {
      "x": 0.1,
      "y": 0.2
    }
  },
  "protocol": {
    "name": "OpenTrackIO",
    "version": [
      1,
      0,
      1
    ]
  },
  "sampleId": "urn:uuid:66f7c871-4f24-4f5d-bb94-b0c5a1ad7d06",
  "sourceId": "urn:uuid:1292ee44-593c-4365-a6a7-5cc6e75947c6",
  "sourceNumber": 1,
  "transforms": [
    {
      "translation": {
        "x": 1.0,
        "y": 2.0,
        "z": 3.0
      },
      "rotation": {
        "pan": 180.0,
        "tilt": 90.0,
        "roll": 45.0
      },
      "id": "Camera"
    }
  ]
}
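
A Consumer will typically cache the most recent static block for a source and combine it with each regular sample. A minimal, non-normative Python sketch of that pattern:

# Non-normative sketch: cache the occasional "static" block and merge it with
# every regular sample received for the same source.
import json

class SourceState:
    def __init__(self):
        self.static = {}          # last-seen static metadata for this source

    def ingest(self, payload: bytes) -> dict:
        sample = json.loads(payload)
        if "static" in sample:
            self.static = sample["static"]    # refreshed roughly every 2 seconds
        # Return the dynamic sample together with the cached static context.
        dynamic = {k: v for k, v in sample.items() if k != "static"}
        return {"static": self.static, **dynamic}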


Complete sample

OpenTrackIO defines many more options and fields, and these should be parsed where appropriate by the Consumer. Custom fields can also be added, as shown below, although these require specific Producer/Consumer negotiation.

{
  "static": {
    "duration": {
      "num": 1,
      "denom": 25
    },
    "camera": {
      "captureFrameRate": {
        "num": 24000,
        "denom": 1001
      },
      "activeSensorPhysicalDimensions": {
        "height": 24.0,
        "width": 36.0
      },
      "activeSensorResolution": {
        "height": 2160,
        "width": 3840
      },
      "make": "CameraMaker",
      "model": "Model20",
      "serialNumber": "1234567890A",
      "firmwareVersion": "1.2.3",
      "label": "A",
      "anamorphicSqueeze": {
        "num": 1,
        "denom": 1
      },
      "isoSpeed": 4000,
      "fdlLink": "urn:uuid:92a2d8d8-4c98-4844-9a97-e32f0ab219f5",
      "shutterAngle": 45.0
    },
    "lens": {
      "distortionOverscanMax": 1.2,
      "undistortionOverscanMax": 1.3,
      "make": "LensMaker",
      "model": "Model15",
      "serialNumber": "1234567890A",
      "nominalFocalLength": 14.0,
      "calibrationHistory": [
        "LensMaker 123",
        "TrackerMaker 123"
      ]
    },
    "tracker": {
      "make": "TrackerMaker",
      "model": "Tracker",
      "serialNumber": "1234567890A",
      "firmwareVersion": "1.2.3"
    }
  },
  "tracker": {
    "notes": "Example generated sample.",
    "recording": false,
    "slate": "A101_A_4",
    "status": "Optical Good"
  },
  "timing": {
    "mode": "internal",
    "recordedTimestamp": {
      "seconds": 1718806000,
      "nanoseconds": 500000000
    },
    "sampleRate": {
      "num": 24,
      "denom": 1
    },
    "sampleTimestamp": {
      "seconds": 1718806554,
      "nanoseconds": 500000000
    },
    "sequenceNumber": 0,
    "synchronization": {
      "locked": true,
      "source": "ptp",
      "frequency": {
        "num": 24000,
        "denom": 1001
      },
      "present": true,
      "ptp": {
        "profile": "SMPTE ST2059-2:2021",
        "domain": 1,
        "leaderIdentity": "00:11:22:33:44:55",
        "leaderPriorities": {
          "priority1": 128,
          "priority2": 128
        },
        "leaderAccuracy": 5e-08,
        "leaderTimeSource": "GNSS",
        "meanPathDelay": 0.000123,
        "vlan": 100
      }
    },
    "timecode": {
      "hours": 1,
      "minutes": 2,
      "seconds": 3,
      "frames": 4,
      "frameRate": {
        "num": 24000,
        "denom": 1001
      },
      "subFrame": 1,
      "dropFrame": true
    }
  },
  "lens": {
    "custom": [
      1.0,
      2.0
    ],
    "distortion": [
      {
        "model": "Brown-Conrady U-D",
        "radial": [
          1.0,
          2.0,
          3.0,
          4.0,
          5.0,
          6.0
        ],
        "tangential": [
          1.0,
          2.0
        ],
        "overscan": 3.0
      },
      {
        "radial": [
          1.0,
          2.0,
          3.0,
          4.0,
          5.0,
          6.0
        ],
        "tangential": [
          1.0,
          2.0
        ],
        "overscan": 2.0
      }
    ],
    "distortionOffset": {
      "x": 1.0,
      "y": 2.0
    },
    "encoders": {
      "focus": 0.1,
      "iris": 0.2,
      "zoom": 0.3
    },
    "entrancePupilOffset": 0.123,
    "exposureFalloff": {
      "a1": 1.0,
      "a2": 2.0,
      "a3": 3.0
    },
    "fStop": 4.0,
    "pinholeFocalLength": 24.305,
    "focusDistance": 10.0,
    "projectionOffset": {
      "x": 0.1,
      "y": 0.2
    },
    "rawEncoders": {
      "focus": 1000,
      "iris": 2000,
      "zoom": 3000
    },
    "tStop": 4.1
  },
  "protocol": {
    "name": "OpenTrackIO",
    "version": [
      1,
      0,
      1
    ]
  },
  "sampleId": "urn:uuid:9c07052e-032c-40d6-9942-ac385f6cb3e7",
  "sourceId": "urn:uuid:57dc3bfc-3fa1-4f53-9f3e-e96ab3a08e80",
  "sourceNumber": 1,
  "relatedSampleIds": [
    "urn:uuid:bb54be8b-8f37-4c48-8258-939e334897f7",
    "urn:uuid:72ed1676-1e61-4617-897c-59334635aa28"
  ],
  "globalStage": {
    "E": 100.0,
    "N": 200.0,
    "U": 300.0,
    "lat0": 100.0,
    "lon0": 200.0,
    "h0": 300.0
  },
  "transforms": [
    {
      "translation": {
        "x": 1.0,
        "y": 2.0,
        "z": 3.0
      },
      "rotation": {
        "pan": 180.0,
        "tilt": 90.0,
        "roll": 45.0
      },
      "id": "Dolly"
    },
    {
      "translation": {
        "x": 1.0,
        "y": 2.0,
        "z": 3.0
      },
      "rotation": {
        "pan": 180.0,
        "tilt": 90.0,
        "roll": 45.0
      },
      "scale": {
        "x": 1.0,
        "y": 2.0,
        "z": 3.0
      },
      "id": "Crane Arm"
    },
    {
      "translation": {
        "x": 1.0,
        "y": 2.0,
        "z": 3.0
      },
      "rotation": {
        "pan": 180.0,
        "tilt": 90.0,
        "roll": 45.0
      },
      "scale": {
        "x": 1.0,
        "y": 2.0,
        "z": 3.0
      },
      "id": "Camera"
    }
  ],
  "custom": {
    "pot1": 2435,
    "button1": false
  }
}


Transport recommendations

UDP

OpenTrackIO typically operates over IPv4 UDP. Support for IPv6 is not included in the current version of OpenTrackIO. When using IPv4 UDP, the guidelines below should be followed to ensure interoperability between systems. Byte ordering ('network order') is big-endian.

IP Addressing

It is recommended that OpenTrackIO Producers use multicast addressing to deliver messages to Consumers to guarantee interoperability and ease of configuration. The use of unicast addressing is also allowed, but implementation details are currently outside the scope of this document.

Producers should transmit multicast messages according to the addressing scheme in the table below. Allocated multicast addresses are from the IPv4 Local Scope and will be managed by routers in conformance with RFC 2365.

IP Octet 1 | IP Octet 2 | IP Octet 3 | IP Octet 4
239 | 135 | 1 | Source Number

The Source Number is a user-configurable 8-bit value (1-200) that determines the multicast IP address for a specific Source from a Producer. By embedding the Source Number in the 4th octet of the multicast IP address, this mechanism enables Producers and Consumers to exchange data for a given Source without requiring prior knowledge of the network topology or specific IP addresses.

This design allows multiple Producers and Sources to coexist on the same network, and lets a Consumer receive a given Source's data knowing only its Source Number.

Example

For example, a Producer configured with Source Number 14 transmits its samples to the multicast address 239.135.1.14; Consumers subscribe to that address to receive them.

The Source Number is a configurable, user-assignable value that is unique within an OpenTrackIO network. It identifies a specific Source of data from a particular Producer. The Source Number must be explicitly configured and should not be inferred from the multicast address. Source Numbers above 200 are reserved for future expansion of the OpenTrackIO protocol and may not be used. Consumers shall discard any messages containing a Source Number of 0 or 201-255.

The default destination UDP port for multicast messages is 55555. Other ports may be used if necessary based on local network requirements.

Consumers must handle identical multicast messages consistently. If a Consumer receives the same message multiple times, it should process only one instance.

Any packet that exceeds the UDP packet length limit (~64kB) should be segmented and noted as such in the header. If a multi-segment payload is incomplete due to one or more segments being lost in transmission, the entire payload should be discarded.

OpenTrackIO is a unidirectional protocol; however, if a Consumer receives a message that requires a response, the reply should be sent via unicast to the source address and port of the Producer from which the message originated. Guidelines for when and how devices should respond are outside the scope of the current version of OpenTrackIO, but may be included in a future version.

Multicast Subscription

Components must implement IGMP V2 or any subsequent version that supports its functionality. This protocol communicates multicast address usage to the network infrastructure, ensuring correct delivery of multicast traffic across large and complex networks.
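
As an illustration, the non-normative Python sketch below subscribes to the stream for Source Number 14 from the earlier example on the default port 55555; setting IP_ADD_MEMBERSHIP causes the host's IP stack to issue the IGMP membership report described above.

# Non-normative sketch: a Consumer joining an OpenTrackIO multicast group.
import socket
import struct

SOURCE_NUMBER = 14                       # user-configured Source Number (1-200)
GROUP = f"239.135.1.{SOURCE_NUMBER}"     # per the addressing scheme above
PORT = 55555                             # default destination UDP port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))
# Join the multicast group on the default interface (triggers an IGMP report).
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    packet, addr = sock.recvfrom(65535)
    print(f"{len(packet)} bytes from {addr}")   # OpenTrackIO header + payload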

Packet Header

When using raw UDP or serial transport, each packet should include the header below:

Bit Offset | Field | Description
0-31 | Identifier | 4 bytes: Static value indicating an OpenTrackIO packet, set to ASCII "OTrk" (0x4F54726B)
32-39 | Reserved | 1 byte: Reserved for future use; should be ignored by both Producers and Consumers.
40-47 | Encoding | 1 byte: Indicates the payload format (e.g., JSON = 0x01, CBOR = 0x02, OTP = 0x02). Values 0x80 and above are reserved for vendor-specific protocols.
48-63 | Sequence number | 2 bytes: A 16-bit unsigned integer giving the OpenTrackIO packet's unique sequence number (0x0000 to 0xFFFF)
64-95 | Segment offset | 4 bytes: A 32-bit field indicating the byte offset of this payload segment when the overall payload length necessitates segmentation. Must be set to 0x00 for single-segment payloads.
96 | Last segment flag | 1 bit: Shall be set to 1 if this is the only segment or the last segment of a segmented payload, or 0 if more segments are expected.
97-111 | Payload length | 15 bits: Total length of the payload for the current packet (in bytes).
112-127 | Checksum (Fletcher-16) | 2 bytes: A 16-bit checksum computed using the Fletcher-16 algorithm with a modulus of 256, covering the header (excluding the checksum bytes) and payload.
128+ | Payload | The JSON or CBOR OpenTrackIO payload (or a segment thereof) starts here
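
The checksum in the final field can be computed with the C reference routine below; because the running sums are 8-bit they wrap naturally, giving the modulus-256 behaviour described in the table.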

#include <stdint.h>

/* Fletcher-16 checksum with a modulus of 256: the 8-bit running sums wrap
 * naturally on overflow. Computed over the header (excluding the checksum
 * bytes) followed by the payload. */
uint16_t fletcher (const uint8_t* data, uint16_t len)
{
    if (!data) return 0;
    uint8_t sum1 = 0;
    uint8_t sum2 = 0;
    while (len-- > 0) {
        sum1 += *data++;   /* wraps modulo 256 */
        sum2 += sum1;      /* wraps modulo 256 */
    }
    return ((uint16_t) sum2 << 8) | (uint16_t) sum1;
}
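
For illustration, the following non-normative Python sketch packs a single-segment packet according to the header layout and checksum above; the JSON encoding value 0x01 is taken from the table.

# Non-normative sketch: build a single-segment OpenTrackIO packet (big-endian).
import struct

def fletcher16_mod256(data: bytes) -> int:
    sum1 = sum2 = 0
    for b in data:
        sum1 = (sum1 + b) % 256
        sum2 = (sum2 + sum1) % 256
    return (sum2 << 8) | sum1

def build_packet(payload: bytes, sequence_number: int, encoding: int = 0x01) -> bytes:
    # Single-segment only: the payload length must fit in the 15-bit length field.
    if len(payload) >= 1 << 15:
        raise ValueError("payload must be segmented (see segmentation rules above)")
    flag_and_length = (1 << 15) | len(payload)   # last-segment flag set + 15-bit length
    header = struct.pack(
        ">4sBBHIH",                # big-endian ('network order')
        b"OTrk",                   # identifier (bits 0-31)
        0x00,                      # reserved (bits 32-39)
        encoding,                  # encoding, 0x01 = JSON (bits 40-47)
        sequence_number & 0xFFFF,  # sequence number (bits 48-63)
        0,                         # segment offset, 0 for a single segment (bits 64-95)
        flag_and_length,           # last-segment flag + payload length (bits 96-111)
    )
    checksum = fletcher16_mod256(header + payload)   # header (excluding checksum) + payload
    return header + struct.pack(">H", checksum) + payload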
              

Description of all fields

Parameter | Section | Sampling | Description | Units | Constraints
duration | None | Static | Duration of the clip | second | The parameter shall be a rational number whose numerator is in the range [0..2,147,483,647] and denominator in the range (0..4,294,967,295].
captureFrameRate | camera | Static | Capture frame rate of the camera | hertz | The parameter shall be a rational number whose numerator is in the range [0..2,147,483,647] and denominator in the range (0..4,294,967,295].
activeSensorPhysicalDimensions | camera | Static | Height and width of the active area of the camera sensor in millimeters | millimeter | The height and width shall each be real non-negative numbers.
activeSensorResolution | camera | Static | Photosite resolution of the active area of the camera sensor in pixels | pixel | The height and width shall each be an integer in the range [0..2,147,483,647].
make | camera | Static | Non-blank string naming camera manufacturer | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
model | camera | Static | Non-blank string identifying camera model | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
serialNumber | camera | Static | Non-blank string uniquely identifying the camera | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
firmwareVersion | camera | Static | Non-blank string identifying camera firmware version | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
label | camera | Static | Non-blank string containing user-determined camera identifier | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
anamorphicSqueeze | camera | Static | Nominal ratio of height to width of the image of an axis-aligned square captured by the camera sensor. It can be used to de-squeeze images but is not however an exact number over the entire captured area due to a lens' intrinsic analog nature. | None | The parameter shall be a rational number whose numerator is in the range [0..2,147,483,647] and denominator in the range (0..4,294,967,295].
isoSpeed | camera | Static | Arithmetic ISO scale as defined in ISO 12232 | None | The parameter shall be an integer in the range (1..4,294,967,295].
fdlLink | camera | Static | URN identifying the ASC Framing Decision List used by the camera. | None | The parameter shall be a UUID URN as specified in IETF RFC 4122. Only lowercase characters shall be used. Example: `f81d4fae-7dec-11d0-a765-00a0c91e6bf6`
shutterAngle | camera | Static | Shutter speed as a fraction of the capture frame rate. The shutter speed (in units of 1/s) is equal to the value of the parameter divided by 360 times the capture frame rate. | degree | The parameter shall be a real number in the range (0..360].
distortionOverscanMax | lens | Static | Static maximum overscan factor on lens distortion. This is an alternative to providing dynamic overscan values each frame. Note it should be the maximum of both projection-matrix-based and field-of-view-based rendering as per the OpenLensIO documentation. | None | The parameter shall be a real number >= 1.
undistortionOverscanMax | lens | Static | Static maximum overscan factor on lens undistortion. This is an alternative to providing dynamic overscan values each frame. Note it should be the maximum of both projection-matrix-based and field-of-view-based rendering as per the OpenLensIO documentation. | None | The parameter shall be a real number >= 1.
make | lens | Static | Non-blank string naming lens manufacturer | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
model | lens | Static | Non-blank string identifying lens model | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
serialNumber | lens | Static | Non-blank string uniquely identifying the lens | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
firmwareVersion | lens | Static | Non-blank string identifying lens firmware version | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
nominalFocalLength | lens | Static | Nominal focal length of the lens. The number printed on the side of a prime lens, e.g. 50 mm, and undefined in the case of a zoom lens. | millimeter | The parameter shall be a real number greater than 0.
calibrationHistory | lens | Static | List of free strings that describe the history of calibrations of the lens. | None | None
make | tracker | Static | Non-blank string naming tracking device manufacturer | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
model | tracker | Static | Non-blank string identifying tracking device model | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
serialNumber | tracker | Static | Non-blank string uniquely identifying the tracking device | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
firmwareVersion | tracker | Static | Non-blank string identifying tracking device firmware version | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
notes | tracker | Regular | Non-blank string containing notes about tracking system | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
recording | tracker | Regular | Boolean indicating whether tracking system is recording data | None | The parameter shall be a boolean.
slate | tracker | Regular | Non-blank string describing the recording slate | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
status | tracker | Regular | Non-blank string describing status of tracking system | None | The parameter shall be a Unicode string between 0 and 1023 codepoints.
mode | timing | Regular | Enumerated value indicating whether the sample transport mechanism provides inherent ('external') timing, or whether the transport mechanism lacks inherent timing and so the sample must contain a PTP timestamp itself ('internal') to carry timing information. | None | The parameter shall be one of the allowed values.
recordedTimestamp | timing | Regular | PTP timestamp of the data recording instant, provided for convenience during playback of e.g. pre-recorded tracking data. The timestamp comprises a 48-bit unsigned integer (seconds) and a 32-bit unsigned integer (nanoseconds). | second | The parameter shall contain a valid number of seconds and nanoseconds elapsed since the start of the epoch.
sampleRate | timing | Regular | Sample frame rate as a rational number. Drop frame rates such as 29.97 should be represented as e.g. 30000/1001. In a variable rate system this should be estimated from the last sample delta time. | None | The parameter shall be a rational number whose numerator is in the range [0..2,147,483,647] and denominator in the range (0..4,294,967,295].
sampleTimestamp | timing | Regular | PTP timestamp of the data capture instant. Note this may differ from the packet's transmission PTP timestamp. The timestamp comprises a 48-bit unsigned integer (seconds) and a 32-bit unsigned integer (nanoseconds). | second | The parameter shall contain a valid number of seconds and nanoseconds elapsed since the start of the epoch.
sequenceNumber | timing | Regular | Integer incrementing with each sample. | None | The parameter shall be an integer in the range (0..4,294,967,295].
synchronization | timing | Regular | Object describing how the tracking device is synchronized for this sample. frequency: The frequency of a synchronization signal. This may differ from the sample frame rate, for example in a genlocked tracking device. This is not required if the synchronization source is PTP or NTP. locked: Is the tracking device locked to the synchronization source. offsets: Offsets in seconds between sync and sample. Critical for e.g. frame remapping, or when using different data sources for position/rotation and lens encoding. present: Is the synchronization source present (a synchronization source can be present but not locked if frame rates differ, for example). ptp: If the synchronization source is a PTP leader, then this object contains: - "profile": Specifies the PTP profile in use. This defines the operational rules and parameters for synchronization. For example "SMPTE ST2059-2:2021" for SMPTE 2110 based systems, or "IEEE Std 1588-2019" or "IEEE Std 802.1AS-2020" for industrial applications - "domain": Identifies the PTP domain the device belongs to. Devices in the same domain can synchronize with each other - "leaderIdentity": The unique identifier (usually MAC address) of the current PTP leader (grandmaster) - "leaderPriorities": The priority values of the leader used in the Best Master Clock Algorithm (BMCA). Lower values indicate higher priority - "priority1": Static priority set by the administrator - "priority2": Dynamic priority based on the leader's role or clock quality - "leaderAccuracy": The timing offset in seconds from the sample timestamp to the PTP timestamp - "meanPathDelay": The average round-trip delay between the device and the PTP leader, measured in seconds source: The source of synchronization must be defined as one of the following: - "vlan": Integer representing the VLAN ID for PTP traffic (e.g., 100 for VLAN 100) - "leaderTimeSource": Indicates the leader's source of time, such as GNSS, atomic clock, or NTP - "genlock": The tracking device has an external black/burst or tri-level analog sync signal that is triggering the capture of tracking samples - "videoIn": The tracking device has an external video signal that is triggering the capture of tracking samples - "ptp": The tracking device is locked to a PTP leader - "ntp": The tracking device is locked to an NTP server | None | The parameter shall contain the required valid fields.
timecode | timing | Regular | SMPTE timecode of the sample. Timecode is a standard for labeling individual frames of data in media systems and is useful for inter-frame synchronization. Frame rate is a rational number, allowing drop frame rates such as that colloquially called 29.97 to be represented exactly, as 30000/1001. The timecode frame rate may differ from the sample frequency. The zero-based sub-frame field allows for finer division of the frame, e.g. interlaced frames have two sub-frames, one per field. | None | The parameter shall contain a valid format and hours, minutes, seconds and frames with appropriate min/max values.
custom | lens | Regular | This list provides optional additional custom coefficients that can extend the existing lens model. The meaning of and how these characteristics are to be applied to a virtual camera would require negotiation between a particular producer and consumer. | None | The parameter shall be a tuple of items of the class itemClass. The tuple can be empty.
distortion | lens | Regular | A list of Distortion objects that each define the coefficients for calculating the distortion characteristics of a lens, comprising radial distortion coefficients of the spherical distortion (k1-N) and (optionally) the tangential distortion (p1-N). The key 'model' names the distortion model. Typical values for 'model' include "Brown-Conrady D-U" when mapping distorted to undistorted coordinates, and "Brown-Conrady U-D" when mapping undistorted to distorted coordinates. If not provided, the default model is "Brown-Conrady D-U". | None | The list shall contain at least one Distortion object, and in each object the radial and tangential coefficients shall each be real numbers.
distortionOffset | lens | Regular | Offset in x and y of the centre of distortion of the virtual camera | millimeter | X and Y centre shift shall each be real numbers.
encoders | lens | Regular | Normalised real numbers (0-1) for focus, iris and zoom. Encoders are represented in this way (as opposed to raw integer values) to ensure values remain independent of encoder resolution, minimum and maximum (at an acceptable loss of precision). These values are only relevant in lenses with end-stops that demarcate the 0 and 1 range. Values should be provided in the following directions (if known): Focus: 0=infinite, 1=closest; Iris: 0=open, 1=closed; Zoom: 0=wide angle, 1=telephoto | None | The parameter shall contain at least one normalised value (0..1) for the FIZ encoders.
entrancePupilOffset | lens | Regular | Offset of the entrance pupil relative to the nominal imaging plane (positive if the entrance pupil is located on the side of the nominal imaging plane that is towards the object, and negative otherwise). Measured in meters as in a render engine it is often applied in the virtual camera's transform chain. | meter | The parameter shall be a real number.
exposureFalloff | lens | Regular | Coefficients for calculating the exposure fall-off (vignetting) of a lens | None | The coefficients shall each be real numbers.
fStop | lens | Regular | The linear f-number of the lens, equal to the focal length divided by the diameter of the entrance pupil. | None | The parameter shall be a real number greater than 0.
pinholeFocalLength | lens | Regular | Distance between the pinhole and the image plane in the simple CGI pinhole camera model. | millimeter | The parameter shall be a real number greater than 0.
focusDistance | lens | Regular | Focus distance/position of the lens | meter | The parameter shall be a real number greater than 0.
projectionOffset | lens | Regular | Offset in x and y of the centre of perspective projection of the virtual camera | millimeter | X and Y projection offset shall each be real numbers.
rawEncoders | lens | Regular | Raw encoder values for focus, iris and zoom. These values are dependent on encoder resolution and are taken before any homing / ranging has taken place. | None | The parameter shall contain at least one integer value for the FIZ encoders.
tStop | lens | Regular | Linear t-number of the lens, equal to the F-number of the lens divided by the square root of the transmittance of the lens. | None | The parameter shall be a real number greater than 0.
protocol | None | Regular | Name of the protocol in which the sample is being employed, and version of that protocol | None | Protocol name is a non-blank string; protocol version is a basic x.y.z semantic versioning string.
sampleId | None | Regular | URN serving as unique identifier of the sample in which data is being transported. | None | The parameter shall be a UUID URN as specified in IETF RFC 4122. Only lowercase characters shall be used. Example: `f81d4fae-7dec-11d0-a765-00a0c91e6bf6`
sourceId | None | Regular | URN serving as unique identifier of the source from which data is being transported. | None | The parameter shall be a UUID URN as specified in IETF RFC 4122. Only lowercase characters shall be used. Example: `f81d4fae-7dec-11d0-a765-00a0c91e6bf6`
sourceNumber | None | Regular | Number that identifies the index of the stream from a source from which data is being transported. This is most important in the case where a source is producing multiple streams of samples. | None | The parameter shall be an integer in the range (0..4,294,967,295].
relatedSampleIds | None | Regular | List of sampleId properties of samples related to this sample. The existence of a sample with a given sampleId is not guaranteed. | None | The parameter shall be a tuple of items of the class itemClass. The tuple can be empty.
globalStage | None | Regular | Position of stage origin in global ENU and geodetic coordinates (E, N, U, lat0, lon0, h0). Note this may be dynamic if the stage is inside a moving vehicle. | meter | Each field in the GlobalPosition shall be a real number.
transforms | None | Regular | A list of transforms. Transforms are composed in sequential order, starting with the first transform in the list and concluding with the last transform in the list. The compound transform contains the position (in meters) and orientation (in degrees) of the camera sensor relative to stage origin. The Z axis points upwards and the coordinate system is right-handed. Y points in the forward camera direction (when pan, tilt and roll are zero). For example in an LED volume Y would point towards the centre of the LED wall and so X would point to camera-right. Rotation is expressed as Euler angles in degrees of the camera sensor relative to stage origin. Rotations are intrinsic and are measured around the axes ZXY, commonly referred to as [pan, tilt, roll]. Notes on Euler angles: Euler angles are human readable and, unlike quaternions, provide the ability for cycles (with angles >360 or <0 degrees). Where a tracking system is providing the pose of a virtual camera, gimbal lock does not present the physical challenges of a robotic system. Conversion to and from quaternions is trivial with an acceptable loss of precision. | meter / degree | Each component of each transform shall contain real numbers.

JSON schema

This JSON Schema can be used to validate OpenTrackIO samples.
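
For example, a received sample might be checked against this schema with the third-party Python jsonschema package (assumed installed, with the schema below saved locally as schema.json):

# Non-normative sketch: validate a parsed sample against the OpenTrackIO schema.
import json
import jsonschema   # third-party: pip install jsonschema

with open("schema.json") as f:
    schema = json.load(f)

payload = b'{"protocol": {"name": "OpenTrackIO", "version": [1, 0, 1]}}'  # e.g. received over UDP
sample = json.loads(payload)
jsonschema.validate(instance=sample, schema=schema)   # raises ValidationError if invalid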

{
  "$id": "https://opentrackio.org/schema.json",
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "static": {
      "type": "object",
      "properties": {
        "duration": {
          "type": "object",
          "properties": {
            "num": {
              "type": "integer",
              "maximum": 2147483647,
              "minimum": 1
            },
            "denom": {
              "type": "integer",
              "maximum": 4294967295,
              "minimum": 1
            }
          },
          "required": [
            "num",
            "denom"
          ],
          "additionalProperties": false,
          "description": "Duration of the clip",
          "units": "second"
        },
        "camera": {
          "type": "object",
          "properties": {
            "captureFrameRate": {
              "type": "object",
              "properties": {
                "num": {
                  "type": "integer",
                  "maximum": 2147483647,
                  "minimum": 1
                },
                "denom": {
                  "type": "integer",
                  "maximum": 4294967295,
                  "minimum": 1
                }
              },
              "required": [
                "num",
                "denom"
              ],
              "additionalProperties": false,
              "description": "Capture frame rate of the camera",
              "units": "hertz"
            },
            "activeSensorPhysicalDimensions": {
              "type": "object",
              "properties": {
                "height": {
                  "type": "number",
                  "minimum": 0.0
                },
                "width": {
                  "type": "number",
                  "minimum": 0.0
                }
              },
              "required": [
                "height",
                "width"
              ],
              "description": "Height and width of the active area of the camera sensor in millimeters",
              "additionalProperties": false,
              "units": "millimeter"
            },
            "activeSensorResolution": {
              "type": "object",
              "properties": {
                "height": {
                  "type": "integer",
                  "maximum": 2147483647,
                  "minimum": 0
                },
                "width": {
                  "type": "integer",
                  "maximum": 2147483647,
                  "minimum": 0
                }
              },
              "required": [
                "height",
                "width"
              ],
              "description": "Photosite resolution of the active area of the camera sensor in pixels",
              "additionalProperties": false,
              "units": "pixel"
            },
            "make": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string naming camera manufacturer"
            },
            "model": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string identifying camera model"
            },
            "serialNumber": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string uniquely identifying the camera"
            },
            "firmwareVersion": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string identifying camera firmware version"
            },
            "label": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string containing user-determined camera identifier"
            },
            "anamorphicSqueeze": {
              "type": "object",
              "properties": {
                "num": {
                  "type": "integer",
                  "maximum": 2147483647,
                  "minimum": 1
                },
                "denom": {
                  "type": "integer",
                  "maximum": 4294967295,
                  "minimum": 1
                }
              },
              "required": [
                "num",
                "denom"
              ],
              "additionalProperties": false,
              "description": "Nominal ratio of height to width of the image of an axis-aligned\nsquare captured by the camera sensor. It can be used to de-squeeze\nimages but is not however an exact number over the entire captured\narea due to a lens' intrinsic analog nature.\n"
            },
            "isoSpeed": {
              "type": "integer",
              "maximum": 4294967295,
              "minimum": 1,
              "description": "Arithmetic ISO scale as defined in ISO 12232"
            },
            "fdlLink": {
              "type": "string",
              "pattern": "^urn:uuid:[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$",
              "description": "URN identifying the ASC Framing Decision List used by the camera."
            },
            "shutterAngle": {
              "type": "number",
              "maximum": 360.0,
              "minimum": 0.0,
              "description": "Shutter speed as a fraction of the capture frame rate. The shutter\nspeed (in units of 1/s) is equal to the value of the parameter divided\nby 360 times the capture frame rate.\n",
              "units": "degree"
            }
          },
          "additionalProperties": false
        },
        "lens": {
          "type": "object",
          "properties": {
            "distortionOverscanMax": {
              "type": "number",
              "minimum": 1.0,
              "description": "Static maximum overscan factor on lens distortion. This is an\nalternative to providing dynamic overscan values each frame. Note it\nshould be the maximum of both projection-matrix-based and\nfield-of-view-based rendering as per the OpenLensIO documentation.\n"
            },
            "undistortionOverscanMax": {
              "type": "number",
              "minimum": 1.0,
              "description": "Static maximum overscan factor on lens undistortion. This is an\nalternative to providing dynamic overscan values each frame. Note it\nshould be the maximum of both projection-matrix-based and\nfield-of-view-based rendering as per the OpenLensIO documentation.\n"
            },
            "make": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string naming lens manufacturer"
            },
            "model": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string identifying lens model"
            },
            "serialNumber": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string uniquely identifying the lens"
            },
            "firmwareVersion": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string identifying lens firmware version"
            },
            "nominalFocalLength": {
              "type": "number",
              "exclusiveMinimum": 0.0,
              "description": "Nominal focal length of the lens. The number printed on the side\nof a prime lens, e.g. 50 mm, and undefined in the case of a zoom lens.\n",
              "units": "millimeter"
            },
            "calibrationHistory": {
              "type": "array",
              "description": "List of free strings that describe the history of calibrations of the lens.",
              "items": {
                "type": "string",
                "minLength": 1,
                "maxLength": 1023
              }
            }
          },
          "additionalProperties": false
        },
        "tracker": {
          "type": "object",
          "properties": {
            "make": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string naming tracking device manufacturer"
            },
            "model": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string identifying tracking device model"
            },
            "serialNumber": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string uniquely identifying the tracking device"
            },
            "firmwareVersion": {
              "type": "string",
              "minLength": 1,
              "maxLength": 1023,
              "description": "Non-blank string identifying tracking device firmware version"
            }
          },
          "additionalProperties": false
        }
      },
      "additionalProperties": false
    },
    "tracker": {
      "type": "object",
      "properties": {
        "notes": {
          "type": "string",
          "minLength": 1,
          "maxLength": 1023,
          "description": "Non-blank string containing notes about tracking system"
        },
        "recording": {
          "type": "boolean",
          "description": "Boolean indicating whether tracking system is recording data"
        },
        "slate": {
          "type": "string",
          "minLength": 1,
          "maxLength": 1023,
          "description": "Non-blank string describing the recording slate"
        },
        "status": {
          "type": "string",
          "minLength": 1,
          "maxLength": 1023,
          "description": "Non-blank string describing status of tracking system"
        }
      },
      "additionalProperties": false
    },
    "timing": {
      "type": "object",
      "properties": {
        "mode": {
          "enum": [
            "internal",
            "external"
          ],
          "type": "string",
          "description": "Enumerated value indicating whether the sample transport mechanism\nprovides inherent ('external') timing, or whether the transport\nmechanism lacks inherent timing and so the sample must contain a PTP\ntimestamp itself ('internal') to carry timing information.\n"
        },
        "recordedTimestamp": {
          "type": "object",
          "properties": {
            "seconds": {
              "type": "integer",
              "maximum": 281474976710655,
              "minimum": 0
            },
            "nanoseconds": {
              "type": "integer",
              "maximum": 4294967295,
              "minimum": 0
            }
          },
          "required": [
            "seconds",
            "nanoseconds"
          ],
          "additionalProperties": false,
          "units": "second",
          "description": "PTP timestamp of the data recording instant, provided for convenience\nduring playback of e.g. pre-recorded tracking data. The timestamp\ncomprises a 48-bit unsigned integer (seconds), a 32-bit unsigned\ninteger (nanoseconds)\n"
        },
        "sampleRate": {
          "type": "object",
          "properties": {
            "num": {
              "type": "integer",
              "maximum": 2147483647,
              "minimum": 1
            },
            "denom": {
              "type": "integer",
              "maximum": 4294967295,
              "minimum": 1
            }
          },
          "required": [
            "num",
            "denom"
          ],
          "additionalProperties": false,
          "description": "Sample frame rate as a rational number. Drop frame rates such as\n29.97 should be represented as e.g. 30000/1001. In a variable rate\nsystem this should be estimated from the last sample delta time.\n"
        },
        "sampleTimestamp": {
          "type": "object",
          "properties": {
            "seconds": {
              "type": "integer",
              "maximum": 281474976710655,
              "minimum": 0
            },
            "nanoseconds": {
              "type": "integer",
              "maximum": 4294967295,
              "minimum": 0
            }
          },
          "required": [
            "seconds",
            "nanoseconds"
          ],
          "additionalProperties": false,
          "units": "second",
          "description": "PTP timestamp of the data capture instant. Note this may differ\nfrom the packet's transmission PTP timestamp. The timestamp\ncomprises a 48-bit unsigned integer (seconds), a 32-bit unsigned\ninteger (nanoseconds)\n"
        },
        "sequenceNumber": {
          "type": "integer",
          "maximum": 4294967295,
          "minimum": 0,
          "description": "Integer incrementing with each sample."
        },
        "synchronization": {
          "type": "object",
          "properties": {
            "locked": {
              "type": "boolean"
            },
            "source": {
              "enum": [
                "genlock",
                "videoIn",
                "ptp",
                "ntp"
              ],
              "type": "string"
            },
            "frequency": {
              "type": "object",
              "properties": {
                "num": {
                  "type": "integer",
                  "maximum": 2147483647,
                  "minimum": 1
                },
                "denom": {
                  "type": "integer",
                  "maximum": 4294967295,
                  "minimum": 1
                }
              },
              "required": [
                "num",
                "denom"
              ],
              "additionalProperties": false
            },
            "offsets": {
              "type": "object",
              "properties": {
                "translation": {
                  "type": "number"
                },
                "rotation": {
                  "type": "number"
                },
                "lensEncoders": {
                  "type": "number"
                }
              },
              "additionalProperties": false
            },
            "present": {
              "type": "boolean"
            },
            "ptp": {
              "type": "object",
              "properties": {
                "profile": {
                  "enum": [
                    "IEEE Std 1588-2019",
                    "IEEE Std 802.1AS-2020",
                    "SMPTE ST2059-2:2021"
                  ],
                  "type": "string"
                },
                "domain": {
                  "type": "integer",
                  "maximum": 127,
                  "minimum": 0
                },
                "leaderIdentity": {
                  "type": "string",
                  "minLength": 1,
                  "maxLength": 1023,
                  "pattern": "(?:^[0-9a-f]{2}(?::[0-9a-f]{2}){5}$)|(?:^[0-9a-f]{2}(?:-[0-9a-f]{2}){5}$)"
                },
                "leaderPriorities": {
                  "type": "object",
                  "properties": {
                    "priority1": {
                      "type": "integer",
                      "maximum": 255,
                      "minimum": 0
                    },
                    "priority2": {
                      "type": "integer",
                      "maximum": 255,
                      "minimum": 0
                    }
                  },
                  "required": [
                    "priority1",
                    "priority2"
                  ],
                  "description": "Data structure for PTP synchronization priorities",
                  "additionalProperties": false
                },
                "leaderAccuracy": {
                  "type": "number",
                  "minimum": 0.0
                },
                "leaderTimeSource": {
                  "enum": [
                    "GNSS",
                    "Atomic clock",
                    "NTP"
                  ],
                  "type": "string"
                },
                "meanPathDelay": {
                  "type": "number",
                  "minimum": 0.0
                },
                "vlan": {
                  "type": "integer",
                  "maximum": 4294967295,
                  "minimum": 0
                }
              },
              "required": [
                "profile",
                "domain",
                "leaderIdentity",
                "leaderPriorities",
                "leaderAccuracy",
                "meanPathDelay"
              ],
              "additionalProperties": false
            }
          },
          "required": [
            "locked",
            "source"
          ],
          "additionalProperties": false,
          "description": "Object describing how the tracking device is synchronized for this\nsample.\n\nfrequency: The frequency of a synchronization signal.This may differ from\nthe sample frame rate for example in a genlocked tracking device. This is\nnot required if the synchronization source is PTP or NTP.\nlocked: Is the tracking device locked to the synchronization source\noffsets: Offsets in seconds between sync and sample. Critical for e.g.\nframe remapping, or when using different data sources for\nposition/rotation and lens encoding\npresent: Is the synchronization source present (a synchronization\nsource can be present but not locked if frame rates differ for\nexample)\nptp: If the synchronization source is a PTP leader, then this object\ncontains:\n- \"profile\": Specifies the PTP profile in use. This defines the operational\nrules and parameters for synchronization. For example \"SMPTE ST2059-2:2021\"\nfor SMPTE 2110 based systems, or \"IEEE Std 1588-2019\" or\n\"IEEE Std 802.1AS-2020\" for industrial applications\n- \"domain\": Identifies the PTP domain the device belongs to. Devices in the\nsame domain can synchronize with each other\n- \"leaderIdentity\": The unique identifier (usually MAC address) of the\ncurrent PTP leader (grandmaster)\n- \"leaderPriorities\": The priority values of the leader used in the Best\nMaster Clock Algorithm (BMCA). Lower values indicate higher priority\n- \"priority1\": Static priority set by the administrator\n- \"priority2\": Dynamic priority based on the leader's role or clock quality\n- \"leaderAccuracy\": The timing offset in seconds from the sample timestamp\nto the PTP timestamp\n- \"meanPathDelay\": The average round-trip delay between the device and the\nPTP leader, measured in seconds\nsource: The source of synchronization must be defined as one of the\nfollowing:\n- \"vlan\": Integer representing the VLAN ID for PTP traffic (e.g., 100 for\nVLAN 100)\n- \"leaderTimeSource\": Indicates the leader's source of time, such as GNSS, atomic\nclock, or NTP\n- \"genlock\": The tracking device has an external black/burst or\ntri-level analog sync signal that is triggering the capture of\ntracking samples\n- \"videoIn\": The tracking device has an external video signal that is\ntriggering the capture of tracking samples\n- \"ptp\": The tracking device is locked to a PTP leader\n- \"ntp\": The tracking device is locked to an NTP server\n"
        },
        "timecode": {
          "type": "object",
          "properties": {
            "hours": {
              "type": "integer",
              "maximum": 23,
              "minimum": 0
            },
            "minutes": {
              "type": "integer",
              "maximum": 59,
              "minimum": 0
            },
            "seconds": {
              "type": "integer",
              "maximum": 59,
              "minimum": 0
            },
            "frames": {
              "type": "integer",
              "maximum": 119,
              "minimum": 0
            },
            "frameRate": {
              "type": "object",
              "properties": {
                "num": {
                  "type": "integer",
                  "maximum": 2147483647,
                  "minimum": 1
                },
                "denom": {
                  "type": "integer",
                  "maximum": 4294967295,
                  "minimum": 1
                }
              },
              "required": [
                "num",
                "denom"
              ],
              "additionalProperties": false
            },
            "subFrame": {
              "type": "integer",
              "maximum": 4294967295,
              "minimum": 0
            },
            "dropFrame": {
              "type": "boolean"
            }
          },
          "required": [
            "hours",
            "minutes",
            "seconds",
            "frames",
            "frameRate"
          ],
          "description": "SMPTE timecode of the sample. Timecode is a standard for labeling\nindividual frames of data in media systems and is useful for\ninter-frame synchronization. Frame rate is a rational number, allowing\ndrop frame rates such as that colloquially called 29.97 to be\nrepresented exactly, as 30000/1001. The timecode frame rate may differ\nfrom the sample frequency. The zero-based sub-frame field allows for finer\ndivision of the frame, e.g. interlaced frames have two sub-frames,\none per field.\n",
          "additionalProperties": false
        }
      },
      "additionalProperties": false
    },
    "lens": {
      "type": "object",
      "properties": {
        "custom": {
          "type": "array",
          "items": {
            "type": "number"
          },
          "description": "This list provides optional additional custom coefficients that can \nextend the existing lens model. The meaning of and how these characteristics\nare to be applied to a virtual camera would require negotiation between a\nparticular producer and consumer.\n"
        },
        "distortion": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "model": {
                "type": "string",
                "minLength": 1,
                "maxLength": 1023
              },
              "radial": {
                "type": "array",
                "items": {
                  "type": "number"
                },
                "minItems": 1
              },
              "tangential": {
                "type": "array",
                "items": {
                  "type": "number"
                },
                "minItems": 1
              },
              "overscan": {
                "type": "number",
                "minimum": 1.0,
                "description": "Overscan factor on lens [un]distortion. Overscan may be provided by the\nproducer but can also be overriden or calculated by the consumer. Note\nthis should be the maximum of both projection-matrix-based and field-of-\nview-based rendering as per the OpenLensIO documentation.\n"
              }
            },
            "required": [
              "radial"
            ],
            "additionalProperties": false
          },
          "minItems": 1,
          "description": "A list of Distortion objects that each define the coefficients for\ncalculating the distortion characteristics of a lens comprising radial\ndistortion coefficients of the spherical distortion (k1-N) and \n(optionally) the tangential distortion (p1-N). The key 'model'\nnames the distortion model. Typical values for 'model' include \n\"Brown-Conrady D-U\" when mapping distorted to undistorted coordinates,\nand \"Brown-Conrady U-D\" when mapping undistorted to undistorted\ncoordinates. If not provided, the default model is \"Brown-Conrady D-U\".\n"
        },
        "distortionOffset": {
          "type": "object",
          "properties": {
            "x": {
              "type": "number"
            },
            "y": {
              "type": "number"
            }
          },
          "required": [
            "x",
            "y"
          ],
          "additionalProperties": false,
          "description": "Offset in x and y of the centre of distortion of the virtual camera",
          "units": "millimeter"
        },
        "encoders": {
          "type": "object",
          "properties": {
            "focus": {
              "type": "number",
              "maximum": 1.0,
              "minimum": 0.0
            },
            "iris": {
              "type": "number",
              "maximum": 1.0,
              "minimum": 0.0
            },
            "zoom": {
              "type": "number",
              "maximum": 1.0,
              "minimum": 0.0
            }
          },
          "additionalProperties": false,
          "description": "Normalised real numbers (0-1) for focus, iris and zoom.\nEncoders are represented in this way (as opposed to raw integer\nvalues) to ensure values remain independent of encoder resolution,\nminimum and maximum (at an acceptable loss of precision).\nThese values are only relevant in lenses with end-stops that\ndemarcate the 0 and 1 range.\nValue should be provided in the following directions (if known):\nFocus:   0=infinite     1=closest\nIris:    0=open         1=closed\nZoom:    0=wide angle   1=telephoto\n",
          "anyOf": [
            {
              "required": [
                "focus"
              ]
            },
            {
              "required": [
                "iris"
              ]
            },
            {
              "required": [
                "zoom"
              ]
            }
          ]
        },
        "entrancePupilOffset": {
          "type": "number",
          "description": "Offset of the entrance pupil relative to the nominal imaging plane\n(positive if the entrance pupil is located on the side of the nominal\nimaging plane that is towards the object, and negative otherwise).\nMeasured in meters as in a render engine it is often applied in the\nvirtual camera's transform chain.\n",
          "units": "meter"
        },
        "exposureFalloff": {
          "type": "object",
          "properties": {
            "a1": {
              "type": "number"
            },
            "a2": {
              "type": "number"
            },
            "a3": {
              "type": "number"
            }
          },
          "required": [
            "a1"
          ],
          "additionalProperties": false,
          "description": "Coefficients for calculating the exposure fall-off (vignetting) of\na lens\n"
        },
        "fStop": {
          "type": "number",
          "exclusiveMinimum": 0.0,
          "description": "The linear f-number of the lens, equal to the focal length divided\nby the diameter of the entrance pupil.\n"
        },
        "pinholeFocalLength": {
          "type": "number",
          "exclusiveMinimum": 0.0,
          "description": "Distance between the pinhole and the image plane in the simple CGI pinhole camera model.",
          "units": "millimeter"
        },
        "focusDistance": {
          "type": "number",
          "exclusiveMinimum": 0.0,
          "description": "Focus distance/position of the lens",
          "units": "meter"
        },
        "projectionOffset": {
          "type": "object",
          "properties": {
            "x": {
              "type": "number"
            },
            "y": {
              "type": "number"
            }
          },
          "required": [
            "x",
            "y"
          ],
          "additionalProperties": false,
          "description": "Offset in x and y of the centre of perspective projection of the\nvirtual camera\n",
          "units": "millimeter"
        },
        "rawEncoders": {
          "type": "object",
          "properties": {
            "focus": {
              "type": "integer",
              "maximum": 4294967295,
              "minimum": 0
            },
            "iris": {
              "type": "integer",
              "maximum": 4294967295,
              "minimum": 0
            },
            "zoom": {
              "type": "integer",
              "maximum": 4294967295,
              "minimum": 0
            }
          },
          "additionalProperties": false,
          "description": "Raw encoder values for focus, iris and zoom.\nThese values are dependent on encoder resolution and before any\nhoming / ranging has taken place.\n",
          "anyOf": [
            {
              "required": [
                "focus"
              ]
            },
            {
              "required": [
                "iris"
              ]
            },
            {
              "required": [
                "zoom"
              ]
            }
          ]
        },
        "tStop": {
          "type": "number",
          "exclusiveMinimum": 0.0,
          "description": "Linear t-number of the lens, equal to the F-number of the lens\ndivided by the square root of the transmittance of the lens.\n"
        }
      },
      "additionalProperties": false
    },
    "protocol": {
      "type": "object",
      "properties": {
        "name": {
          "type": "string",
          "minLength": 1,
          "maxLength": 1023
        },
        "version": {
          "type": "array",
          "items": {
            "type": "integer",
            "maximum": 9,
            "minimum": 0
          },
          "minItems": 3,
          "maxItems": 3
        }
      },
      "required": [
        "name",
        "version"
      ],
      "additionalProperties": false,
      "description": "Name of the protocol in which the sample is being employed, and\nversion of that protocol\n"
    },
    "sampleId": {
      "type": "string",
      "pattern": "^urn:uuid:[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$",
      "description": "URN serving as unique identifier of the sample in which data is\nbeing transported.\n"
    },
    "sourceId": {
      "type": "string",
      "pattern": "^urn:uuid:[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$",
      "description": "URN serving as unique identifier of the source from which data is\nbeing transported.\n"
    },
    "sourceNumber": {
      "type": "integer",
      "maximum": 4294967295,
      "minimum": 0,
      "description": "Number that identifies the index of the stream from a source from which\ndata is being transported. This is most important in the case where a source\nis producing multiple streams of samples.\n"
    },
    "relatedSampleIds": {
      "type": "array",
      "items": {
        "type": "string",
        "pattern": "^urn:uuid:[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$"
      },
      "description": "List of sampleId properties of samples related to this sample. The\nexistence of a sample with a given sampleId is not guaranteed.\n"
    },
    "globalStage": {
      "type": "object",
      "properties": {
        "E": {
          "type": "number"
        },
        "N": {
          "type": "number"
        },
        "U": {
          "type": "number"
        },
        "lat0": {
          "type": "number"
        },
        "lon0": {
          "type": "number"
        },
        "h0": {
          "type": "number"
        }
      },
      "required": [
        "E",
        "N",
        "U",
        "lat0",
        "lon0",
        "h0"
      ],
      "description": "Position of stage origin in global ENU and geodetic coordinates\n(E, N, U, lat0, lon0, h0). Note this may be dynamic if the stage is\ninside a moving vehicle.\n",
      "additionalProperties": false,
      "units": "meter"
    },
    "transforms": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "translation": {
            "type": "object",
            "properties": {
              "x": {
                "type": "number"
              },
              "y": {
                "type": "number"
              },
              "z": {
                "type": "number"
              }
            },
            "additionalProperties": false,
            "units": "meter"
          },
          "rotation": {
            "type": "object",
            "properties": {
              "pan": {
                "type": "number"
              },
              "tilt": {
                "type": "number"
              },
              "roll": {
                "type": "number"
              }
            },
            "additionalProperties": false,
            "units": "degree"
          },
          "scale": {
            "type": "object",
            "properties": {
              "x": {
                "type": "number"
              },
              "y": {
                "type": "number"
              },
              "z": {
                "type": "number"
              }
            },
            "additionalProperties": false
          },
          "id": {
            "type": "string",
            "minLength": 1,
            "maxLength": 1023
          }
        },
        "required": [
          "translation",
          "rotation"
        ],
        "additionalProperties": false
      },
      "minItems": 1,
      "description": "A list of transforms.\nTransforms are composed in sequential order, starting with the first\ntransform in the list and concluding with the last transform in the list.\nThe compound transform contains the position (in meters) and orientation\n(in degrees) of the camera sensor relative to stage origin.\nThe Z axis points upwards and the coordinate system is right-handed.\nY points in the forward camera direction (when pan, tilt and roll are\nzero).\nFor example in an LED volume Y would point towards the centre of the\nLED wall and so X would point to camera-right.\nRotation expressed as euler angles in degrees of the camera sensor\nrelative to stage origin\nRotations are intrinsic and are measured around the axes ZXY, commonly\nreferred to as [pan, tilt, roll]\nNotes on Euler angles:\nEuler angles are human readable and unlike quarternions, provide the\nability for cycles (with angles >360 or <0 degrees).\nWhere a tracking system is providing the pose of a virtual camera,\ngimbal lock does not present the physical challenges of a robotic\nsystem.\nConversion to and from quarternions is trivial with an acceptable loss\nof precision.\n",
      "units": "meter / degree",
      "uniqueItems": false
    }
  }
}
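
As an illustration of how a Consumer might apply the lens.distortion parameters above, the following Python sketch evaluates the radial (k1-N) and tangential (p1, p2) coefficients in the widely used Brown-Conrady form. This is a minimal sketch, not part of the specification: OpenLensIO defines the authoritative mathematics, coordinate normalization and the meaning of the "D-U" and "U-D" directions, and the distort function and its arguments are illustrative only. The distortionOffset and overscan parameters are not handled here.

def distort(x, y, distortion):
    """Map a normalized coordinate (x, y) through one lens.distortion entry.

    Assumes the undistorted-to-distorted ("U-D") direction of the common
    Brown-Conrady form; see OpenLensIO for the authoritative model.
    """
    radial = distortion.get("radial", [])
    tangential = distortion.get("tangential", [])
    r2 = x * x + y * y
    # Radial factor: 1 + k1*r^2 + k2*r^4 + ...
    radial_factor = 1.0
    r_pow = r2
    for k in radial:
        radial_factor += k * r_pow
        r_pow *= r2
    x_d = x * radial_factor
    y_d = y * radial_factor
    # Tangential (decentering) terms; only p1 and p2 are used in this sketch.
    if len(tangential) >= 2:
        p1, p2 = tangential[0], tangential[1]
        x_d += 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
        y_d += p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

# Example usage with a distortion entry shaped like the schema above:
# distort(0.25, -0.1, {"model": "Brown-Conrady U-D",
#                      "radial": [0.1, 0.02], "tangential": [-0.001, 0.0005]})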

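In the same spirit, the transforms list can be flattened into a single camera-to-stage pose. The sketch below composes the transforms in list order using intrinsic ZXY Euler angles for [pan, tilt, roll]. It assumes NumPy and SciPy, a column-vector convention and SciPy's rotation sign conventions, and it ignores the optional scale and id fields; verify the exact sign conventions against the OpenTrackIO and OpenLensIO documentation rather than relying on this example.

import numpy as np
from scipy.spatial.transform import Rotation

def transform_to_matrix(t):
    """Convert one OpenTrackIO transform object to a 4x4 homogeneous matrix."""
    rot = t.get("rotation", {})
    trn = t.get("translation", {})
    # Intrinsic rotations about Z (pan), X (tilt) and Y (roll), in degrees.
    m = np.eye(4)
    m[:3, :3] = Rotation.from_euler(
        "ZXY",
        [rot.get("pan", 0.0), rot.get("tilt", 0.0), rot.get("roll", 0.0)],
        degrees=True,
    ).as_matrix()
    m[:3, 3] = [trn.get("x", 0.0), trn.get("y", 0.0), trn.get("z", 0.0)]
    return m

def compose(transforms):
    """Compose transforms in list order: the first entry is applied first."""
    m = np.eye(4)
    for t in transforms:
        m = m @ transform_to_matrix(t)
    return m

# Example usage with a single camera transform:
# compose([{"translation": {"x": 1.0, "y": 2.0, "z": 3.0},
#           "rotation": {"pan": 90.0, "tilt": 0.0, "roll": 0.0},
#           "id": "Camera"}])
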
Future additions

In the future RIS intends to add support for: