OpenTrackIO is a free and open-source protocol designed by the SMPTE RiS-OSVP group that seeks to improve interoperability in Virtual Production and beyond.
Virtual Production (VP) encompasses a range of techniques that use camera and lens tracking systems to generate real-time visual effects (VFX) in a render engine; examples include Augmented Reality (AR) and In-Camera Visual Effects (ICVFX).
In these Virtual Production examples the camera tracking system sends the pose of the camera, lens model data and other metadata to a render engine every frame.
In Augmented Reality (AR) setups, this enables the render engine to generate virtual objects from the correct camera position and with the correct lens distortion to match the real-world camera image. In the In-Camera Visual Effects (ICVFX) example, the tracking data is used to render the correct perspective on the LED wall, creating the illusion of depth and a sense of parallax.
In Virtual Production it is critical that the camera capture, the tracking data, and the lens data are synchronized in space and time to accurately reproduce the visual effect. A sample of the OpenTrackIO protocol contains all the required data in the appropriate formats to achieve this.
This documentation is designed for those producing and consuming tracking data. Components that generate and transmit tracking data are referred to as Producers. Components that receive and act upon tracking data are referred to as Consumers. Multiple Producers and Consumers may coexist on the same network at the same time, and a Producer can send multiple concurrent streams of data. There may also be multiple Consumers of a single Producer's data. In the AR example above, the camera tracking system is the Producer and the render engine is the Consumer.
OpenTrackIO defines the schema of JSON samples that contain a wide range of metadata about the device, its transform(s), and the associated camera and lens. The full schema is given below and can be downloaded here.
All the fields described should be considered optional by the Consumer, although for high-quality tracking in Virtual Production, see the recommended set of fields in the samples below.
OpenTrackIO employs a right-handed coordinate system in which the Z-axis points upwards and positive rotations are clockwise when looking along the positive direction of an axis. Y points in the forward camera direction (when pan, tilt and roll are zero). For example, in an LED volume Y would point towards the centre of the LED wall and X would point towards camera-right.
A tool to simulate the mathematical aspects of the OpenTrackIO specification is available in this repository.
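As a worked illustration of this convention, the Python sketch below (using numpy, an assumption of this example rather than a requirement of the protocol) composes the pan, tilt and roll angles carried in the transforms section as intrinsic Z-X-Y rotations into a rotation matrix. It reflects one reading of the stated convention and is not normative.

import numpy as np

def rotation_matrix(pan: float, tilt: float, roll: float) -> np.ndarray:
    """Compose OpenTrackIO pan/tilt/roll (intrinsic Z-X-Y Euler angles,
    in degrees) into a 3x3 rotation matrix in the right-handed, Z-up
    coordinate system described above."""
    p, t, r = np.radians([pan, tilt, roll])
    rz = np.array([[np.cos(p), -np.sin(p), 0.0],
                   [np.sin(p),  np.cos(p), 0.0],
                   [0.0, 0.0, 1.0]])                 # pan about Z
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(t), -np.sin(t)],
                   [0.0, np.sin(t),  np.cos(t)]])    # tilt about X
    ry = np.array([[np.cos(r), 0.0, np.sin(r)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(r), 0.0, np.cos(r)]])    # roll about Y
    # Intrinsic Z-X-Y composition: pan first, then tilt, then roll.
    return rz @ rx @ ry

# With zero rotation the camera looks along +Y (camera-forward).
assert np.allclose(rotation_matrix(0.0, 0.0, 0.0) @ [0.0, 1.0, 0.0],
                   [0.0, 1.0, 0.0])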
It is recommended that metadata samples are transmitted every frame (i.e. to coincide with the video frames from a camera). Each sample provides a snapshot of the status of the tracking system at that instant.
It is recommended that a static metadata object is added to a sample approximately every 2 seconds. This additional metadata describes the context of the samples in the source, with data that may change, for example, every take, but will not change every frame.
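A minimal Producer loop following these recommendations might look like the sketch below; build_sample, build_static and send are hypothetical callables standing in for an implementation's own sample construction and transport code.

import json
import time

FRAME_RATE = 25.0        # one sample per video frame
STATIC_INTERVAL = 2.0    # attach the static block roughly every 2 seconds

def producer_loop(build_sample, build_static, send):
    """Demonstrates the recommended cadence only; build_sample,
    build_static and send are supplied by the integrator."""
    last_static = float("-inf")
    while True:
        sample = build_sample()                  # dynamic, per-frame data
        now = time.monotonic()
        if now - last_static >= STATIC_INTERVAL:
            sample["static"] = build_static()    # infrequent context data
            last_static = now
        send(json.dumps(sample).encode("utf-8"))
        time.sleep(1.0 / FRAME_RATE)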
OpenTrackIO defines many more options and fields, and these should be parsed where appropriate by the Consumer. Custom fields can also be added, as shown in the sample below, although these will require specific Producer/Consumer negotiation.
{
"static": {
"duration": {
"num": 1,
"denom": 25
},
"camera": {
"captureFrameRate": {
"num": 24000,
"denom": 1001
},
"activeSensorPhysicalDimensions": {
"height": 24.0,
"width": 36.0
},
"activeSensorResolution": {
"height": 2160,
"width": 3840
},
"make": "CameraMaker",
"model": "Model20",
"serialNumber": "1234567890A",
"firmwareVersion": "1.2.3",
"label": "A",
"anamorphicSqueeze": {
"num": 1,
"denom": 1
},
"isoSpeed": 4000,
"fdlLink": "urn:uuid:92a2d8d8-4c98-4844-9a97-e32f0ab219f5",
"shutterAngle": 45.0
},
"lens": {
"distortionOverscanMax": 1.2,
"undistortionOverscanMax": 1.3,
"make": "LensMaker",
"model": "Model15",
"serialNumber": "1234567890A",
"nominalFocalLength": 14.0,
"calibrationHistory": [
"LensMaker 123",
"TrackerMaker 123"
]
},
"tracker": {
"make": "TrackerMaker",
"model": "Tracker",
"serialNumber": "1234567890A",
"firmwareVersion": "1.2.3"
}
},
"tracker": {
"notes": "Example generated sample.",
"recording": false,
"slate": "A101_A_4",
"status": "Optical Good"
},
"timing": {
"mode": "internal",
"recordedTimestamp": {
"seconds": 1718806000,
"nanoseconds": 500000000
},
"sampleRate": {
"num": 24,
"denom": 1
},
"sampleTimestamp": {
"seconds": 1718806554,
"nanoseconds": 500000000
},
"sequenceNumber": 0,
"synchronization": {
"locked": true,
"source": "ptp",
"frequency": {
"num": 24000,
"denom": 1001
},
"present": true,
"ptp": {
"profile": "SMPTE ST2059-2:2021",
"domain": 1,
"leaderIdentity": "00:11:22:33:44:55",
"leaderPriorities": {
"priority1": 128,
"priority2": 128
},
"leaderAccuracy": 5e-08,
"leaderTimeSource": "GNSS",
"meanPathDelay": 0.000123,
"vlan": 100
}
},
"timecode": {
"hours": 1,
"minutes": 2,
"seconds": 3,
"frames": 4,
"frameRate": {
"num": 24000,
"denom": 1001
},
"subFrame": 1,
"dropFrame": true
}
},
"lens": {
"custom": [
1.0,
2.0
],
"distortion": [
{
"model": "Brown-Conrady U-D",
"radial": [
1.0,
2.0,
3.0,
4.0,
5.0,
6.0
],
"tangential": [
1.0,
2.0
],
"overscan": 3.0
},
{
"radial": [
1.0,
2.0,
3.0,
4.0,
5.0,
6.0
],
"tangential": [
1.0,
2.0
],
"overscan": 2.0
}
],
"distortionOffset": {
"x": 1.0,
"y": 2.0
},
"encoders": {
"focus": 0.1,
"iris": 0.2,
"zoom": 0.3
},
"entrancePupilOffset": 0.123,
"exposureFalloff": {
"a1": 1.0,
"a2": 2.0,
"a3": 3.0
},
"fStop": 4.0,
"pinholeFocalLength": 24.305,
"focusDistance": 10.0,
"projectionOffset": {
"x": 0.1,
"y": 0.2
},
"rawEncoders": {
"focus": 1000,
"iris": 2000,
"zoom": 3000
},
"tStop": 4.1
},
"protocol": {
"name": "OpenTrackIO",
"version": [
1,
0,
1
]
},
"sampleId": "urn:uuid:9c07052e-032c-40d6-9942-ac385f6cb3e7",
"sourceId": "urn:uuid:57dc3bfc-3fa1-4f53-9f3e-e96ab3a08e80",
"sourceNumber": 1,
"relatedSampleIds": [
"urn:uuid:bb54be8b-8f37-4c48-8258-939e334897f7",
"urn:uuid:72ed1676-1e61-4617-897c-59334635aa28"
],
"globalStage": {
"E": 100.0,
"N": 200.0,
"U": 300.0,
"lat0": 100.0,
"lon0": 200.0,
"h0": 300.0
},
"transforms": [
{
"translation": {
"x": 1.0,
"y": 2.0,
"z": 3.0
},
"rotation": {
"pan": 180.0,
"tilt": 90.0,
"roll": 45.0
},
"id": "Dolly"
},
{
"translation": {
"x": 1.0,
"y": 2.0,
"z": 3.0
},
"rotation": {
"pan": 180.0,
"tilt": 90.0,
"roll": 45.0
},
"scale": {
"x": 1.0,
"y": 2.0,
"z": 3.0
},
"id": "Crane Arm"
},
{
"translation": {
"x": 1.0,
"y": 2.0,
"z": 3.0
},
"rotation": {
"pan": 180.0,
"tilt": 90.0,
"roll": 45.0
},
"scale": {
"x": 1.0,
"y": 2.0,
"z": 3.0
},
"id": "Camera"
}
],
"custom": {
"pot1": 2435,
"button1": false
}
}
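Because every field is optional to the Consumer, parsing should be defensive. The following sketch reads a few fields from a sample like the one above using only the Python standard library; the field names are taken from the sample, but the handling logic is purely illustrative.

import json

def handle_sample(payload: bytes) -> None:
    """Read selected fields from an OpenTrackIO JSON sample, tolerating
    missing keys since all fields are optional to the Consumer."""
    sample = json.loads(payload)

    protocol = sample.get("protocol", {})
    if protocol.get("name") != "OpenTrackIO":
        return  # not an OpenTrackIO sample

    # Transforms compose in order; the compound result is the camera pose.
    for transform in sample.get("transforms", []):
        translation = transform.get("translation", {})
        rotation = transform.get("rotation", {})
        print(transform.get("id"), translation, rotation)

    # Lens data, if present.
    lens = sample.get("lens", {})
    if "focusDistance" in lens:
        print("focus distance (m):", lens["focusDistance"])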
OpenTrackIO typically operates over IPv4 UDP. Support for IPv6 is not included in the current version of OpenTrackIO. When using IPv4 UDP, the guidelines below are intended to ensure interoperability between systems. Byte ordering (or 'network order') is big-endian.
It is recommended that OpenTrackIO Producers use multicast addressing to deliver messages to Consumers, as this guarantees interoperability and eases configuration. The use of unicast addressing is also allowed, but implementation details are currently outside the scope of this document.
Producers should transmit multicast messages according to the addressing scheme in the table below. Allocated multicast addresses are from the IPv4 Local Scope and will be managed by routers in conformance with RFC 2365.
Consumers must handle identical multicast messages consistently. If a Consumer receives the same message multiple times,
it should process only one instance.
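A minimal multicast Consumer using the Python standard library is sketched below. The group address and port are placeholders to be replaced with the values from the addressing scheme, and duplicate messages are filtered on sampleId as required above.

import json
import socket
import struct

MCAST_GROUP = "235.0.0.0"  # placeholder: use the addressing scheme's value
MCAST_PORT = 50000         # placeholder: use the addressing scheme's value

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", MCAST_PORT))
membership = struct.pack("4sl", socket.inet_aton(MCAST_GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)

seen = set()  # sampleIds already processed (bound this in a real system)
while True:
    payload, _source = sock.recvfrom(65535)
    sample = json.loads(payload)  # assumes a single-segment JSON payload
    sample_id = sample.get("sampleId")
    if sample_id in seen:
        continue  # identical message already processed once
    seen.add(sample_id)
    # ... hand the sample to the Consumer's processing pipeline ...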
Any packet that exceeds the UDP packet length limit (~64kB) should be segmented and noted as such in the header. If a multi-segment payload is incomplete due to one or more segments being lost in transmission, the entire payload should be discarded.
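The segmentation header itself is not reproduced in this excerpt, so the sketch below only illustrates the discard rule; payload_id, segment_index and segment_count are hypothetical names for whatever the real header carries.

from typing import Dict, List, Optional

class Reassembler:
    """Illustrative reassembly of multi-segment payloads: a payload is
    delivered only when every segment has arrived, and is discarded
    whole if any segment is lost. Header field names are hypothetical."""

    def __init__(self) -> None:
        self._pending: Dict[int, List[Optional[bytes]]] = {}

    def add(self, payload_id: int, segment_index: int,
            segment_count: int, body: bytes) -> Optional[bytes]:
        slots = self._pending.setdefault(payload_id, [None] * segment_count)
        slots[segment_index] = body
        if all(segment is not None for segment in slots):
            del self._pending[payload_id]
            return b"".join(slots)  # complete payload, ready to parse
        return None                 # incomplete; keep waiting

    def expire(self, payload_id: int) -> None:
        """Call when a segment is deemed lost: discard the entire payload."""
        self._pending.pop(payload_id, None)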
OpenTrackIO is a unidirectional protocol; however, if a Consumer receives a message that requires a response, the reply
should be sent via unicast to the source address and port of the Producer from which the message originated. Guidelines
for when and how devices should respond are outside the scope of the current version of OpenTrackIO, but may be included
in a future version.
When using raw UDP or serial transport, each packet should include the header below:
{
"$id": "https://opentrackio.org/schema.json",
"$schema": "https://json-schema.org/draft/2020-12/schema",
"type": "object",
"properties": {
"static": {
"type": "object",
"properties": {
"duration": {
"type": "object",
"properties": {
"num": {
"type": "integer",
"maximum": 2147483647,
"minimum": 1
},
"denom": {
"type": "integer",
"maximum": 4294967295,
"minimum": 1
}
},
"required": [
"num",
"denom"
],
"additionalProperties": false,
"description": "Duration of the clip",
"units": "second"
},
"camera": {
"type": "object",
"properties": {
"captureFrameRate": {
"type": "object",
"properties": {
"num": {
"type": "integer",
"maximum": 2147483647,
"minimum": 1
},
"denom": {
"type": "integer",
"maximum": 4294967295,
"minimum": 1
}
},
"required": [
"num",
"denom"
],
"additionalProperties": false,
"description": "Capture frame rate of the camera",
"units": "hertz"
},
"activeSensorPhysicalDimensions": {
"type": "object",
"properties": {
"height": {
"type": "number",
"minimum": 0.0
},
"width": {
"type": "number",
"minimum": 0.0
}
},
"required": [
"height",
"width"
],
"description": "Height and width of the active area of the camera sensor in millimeters",
"additionalProperties": false,
"units": "millimeter"
},
"activeSensorResolution": {
"type": "object",
"properties": {
"height": {
"type": "integer",
"maximum": 2147483647,
"minimum": 0
},
"width": {
"type": "integer",
"maximum": 2147483647,
"minimum": 0
}
},
"required": [
"height",
"width"
],
"description": "Photosite resolution of the active area of the camera sensor in pixels",
"additionalProperties": false,
"units": "pixel"
},
"make": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string naming camera manufacturer"
},
"model": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string identifying camera model"
},
"serialNumber": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string uniquely identifying the camera"
},
"firmwareVersion": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string identifying camera firmware version"
},
"label": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string containing user-determined camera identifier"
},
"anamorphicSqueeze": {
"type": "object",
"properties": {
"num": {
"type": "integer",
"maximum": 2147483647,
"minimum": 1
},
"denom": {
"type": "integer",
"maximum": 4294967295,
"minimum": 1
}
},
"required": [
"num",
"denom"
],
"additionalProperties": false,
"description": "Nominal ratio of height to width of the image of an axis-aligned\nsquare captured by the camera sensor. It can be used to de-squeeze\nimages but is not however an exact number over the entire captured\narea due to a lens' intrinsic analog nature.\n"
},
"isoSpeed": {
"type": "integer",
"maximum": 4294967295,
"minimum": 1,
"description": "Arithmetic ISO scale as defined in ISO 12232"
},
"fdlLink": {
"type": "string",
"pattern": "^urn:uuid:[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$",
"description": "URN identifying the ASC Framing Decision List used by the camera."
},
"shutterAngle": {
"type": "number",
"maximum": 360.0,
"minimum": 0.0,
"description": "Shutter speed as a fraction of the capture frame rate. The shutter\nspeed (in units of 1/s) is equal to the value of the parameter divided\nby 360 times the capture frame rate.\n",
"units": "degree"
}
},
"additionalProperties": false
},
"lens": {
"type": "object",
"properties": {
"distortionOverscanMax": {
"type": "number",
"minimum": 1.0,
"description": "Static maximum overscan factor on lens distortion. This is an\nalternative to providing dynamic overscan values each frame. Note it\nshould be the maximum of both projection-matrix-based and\nfield-of-view-based rendering as per the OpenLensIO documentation.\n"
},
"undistortionOverscanMax": {
"type": "number",
"minimum": 1.0,
"description": "Static maximum overscan factor on lens undistortion. This is an\nalternative to providing dynamic overscan values each frame. Note it\nshould be the maximum of both projection-matrix-based and\nfield-of-view-based rendering as per the OpenLensIO documentation.\n"
},
"make": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string naming lens manufacturer"
},
"model": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string identifying lens model"
},
"serialNumber": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string uniquely identifying the lens"
},
"firmwareVersion": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string identifying lens firmware version"
},
"nominalFocalLength": {
"type": "number",
"exclusiveMinimum": 0.0,
"description": "Nominal focal length of the lens. The number printed on the side\nof a prime lens, e.g. 50 mm, and undefined in the case of a zoom lens.\n",
"units": "millimeter"
},
"calibrationHistory": {
"type": "array",
"description": "List of free strings that describe the history of calibrations of the lens.",
"items": {
"type": "string",
"minLength": 1,
"maxLength": 1023
}
}
},
"additionalProperties": false
},
"tracker": {
"type": "object",
"properties": {
"make": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string naming tracking device manufacturer"
},
"model": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string identifying tracking device model"
},
"serialNumber": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string uniquely identifying the tracking device"
},
"firmwareVersion": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string identifying tracking device firmware version"
}
},
"additionalProperties": false
}
},
"additionalProperties": false
},
"tracker": {
"type": "object",
"properties": {
"notes": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string containing notes about tracking system"
},
"recording": {
"type": "boolean",
"description": "Boolean indicating whether tracking system is recording data"
},
"slate": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string describing the recording slate"
},
"status": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"description": "Non-blank string describing status of tracking system"
}
},
"additionalProperties": false
},
"timing": {
"type": "object",
"properties": {
"mode": {
"enum": [
"internal",
"external"
],
"type": "string",
"description": "Enumerated value indicating whether the sample transport mechanism\nprovides inherent ('external') timing, or whether the transport\nmechanism lacks inherent timing and so the sample must contain a PTP\ntimestamp itself ('internal') to carry timing information.\n"
},
"recordedTimestamp": {
"type": "object",
"properties": {
"seconds": {
"type": "integer",
"maximum": 281474976710655,
"minimum": 0
},
"nanoseconds": {
"type": "integer",
"maximum": 4294967295,
"minimum": 0
}
},
"required": [
"seconds",
"nanoseconds"
],
"additionalProperties": false,
"units": "second",
"description": "PTP timestamp of the data recording instant, provided for convenience\nduring playback of e.g. pre-recorded tracking data. The timestamp\ncomprises a 48-bit unsigned integer (seconds), a 32-bit unsigned\ninteger (nanoseconds)\n"
},
"sampleRate": {
"type": "object",
"properties": {
"num": {
"type": "integer",
"maximum": 2147483647,
"minimum": 1
},
"denom": {
"type": "integer",
"maximum": 4294967295,
"minimum": 1
}
},
"required": [
"num",
"denom"
],
"additionalProperties": false,
"description": "Sample frame rate as a rational number. Drop frame rates such as\n29.97 should be represented as e.g. 30000/1001. In a variable rate\nsystem this should be estimated from the last sample delta time.\n"
},
"sampleTimestamp": {
"type": "object",
"properties": {
"seconds": {
"type": "integer",
"maximum": 281474976710655,
"minimum": 0
},
"nanoseconds": {
"type": "integer",
"maximum": 4294967295,
"minimum": 0
}
},
"required": [
"seconds",
"nanoseconds"
],
"additionalProperties": false,
"units": "second",
"description": "PTP timestamp of the data capture instant. Note this may differ\nfrom the packet's transmission PTP timestamp. The timestamp\ncomprises a 48-bit unsigned integer (seconds), a 32-bit unsigned\ninteger (nanoseconds)\n"
},
"sequenceNumber": {
"type": "integer",
"maximum": 4294967295,
"minimum": 0,
"description": "Integer incrementing with each sample."
},
"synchronization": {
"type": "object",
"properties": {
"locked": {
"type": "boolean"
},
"source": {
"enum": [
"genlock",
"videoIn",
"ptp",
"ntp"
],
"type": "string"
},
"frequency": {
"type": "object",
"properties": {
"num": {
"type": "integer",
"maximum": 2147483647,
"minimum": 1
},
"denom": {
"type": "integer",
"maximum": 4294967295,
"minimum": 1
}
},
"required": [
"num",
"denom"
],
"additionalProperties": false
},
"offsets": {
"type": "object",
"properties": {
"translation": {
"type": "number"
},
"rotation": {
"type": "number"
},
"lensEncoders": {
"type": "number"
}
},
"additionalProperties": false
},
"present": {
"type": "boolean"
},
"ptp": {
"type": "object",
"properties": {
"profile": {
"enum": [
"IEEE Std 1588-2019",
"IEEE Std 802.1AS-2020",
"SMPTE ST2059-2:2021"
],
"type": "string"
},
"domain": {
"type": "integer",
"maximum": 127,
"minimum": 0
},
"leaderIdentity": {
"type": "string",
"minLength": 1,
"maxLength": 1023,
"pattern": "(?:^[0-9a-f]{2}(?::[0-9a-f]{2}){5}$)|(?:^[0-9a-f]{2}(?:-[0-9a-f]{2}){5}$)"
},
"leaderPriorities": {
"type": "object",
"properties": {
"priority1": {
"type": "integer",
"maximum": 255,
"minimum": 0
},
"priority2": {
"type": "integer",
"maximum": 255,
"minimum": 0
}
},
"required": [
"priority1",
"priority2"
],
"description": "Data structure for PTP synchronization priorities",
"additionalProperties": false
},
"leaderAccuracy": {
"type": "number",
"minimum": 0.0
},
"leaderTimeSource": {
"enum": [
"GNSS",
"Atomic clock",
"NTP"
],
"type": "string"
},
"meanPathDelay": {
"type": "number",
"minimum": 0.0
},
"vlan": {
"type": "integer",
"maximum": 4294967295,
"minimum": 0
}
},
"required": [
"profile",
"domain",
"leaderIdentity",
"leaderPriorities",
"leaderAccuracy",
"meanPathDelay"
],
"additionalProperties": false
}
},
"required": [
"locked",
"source"
],
"additionalProperties": false,
"description": "Object describing how the tracking device is synchronized for this\nsample.\n\nfrequency: The frequency of a synchronization signal.This may differ from\nthe sample frame rate for example in a genlocked tracking device. This is\nnot required if the synchronization source is PTP or NTP.\nlocked: Is the tracking device locked to the synchronization source\noffsets: Offsets in seconds between sync and sample. Critical for e.g.\nframe remapping, or when using different data sources for\nposition/rotation and lens encoding\npresent: Is the synchronization source present (a synchronization\nsource can be present but not locked if frame rates differ for\nexample)\nptp: If the synchronization source is a PTP leader, then this object\ncontains:\n- \"profile\": Specifies the PTP profile in use. This defines the operational\nrules and parameters for synchronization. For example \"SMPTE ST2059-2:2021\"\nfor SMPTE 2110 based systems, or \"IEEE Std 1588-2019\" or\n\"IEEE Std 802.1AS-2020\" for industrial applications\n- \"domain\": Identifies the PTP domain the device belongs to. Devices in the\nsame domain can synchronize with each other\n- \"leaderIdentity\": The unique identifier (usually MAC address) of the\ncurrent PTP leader (grandmaster)\n- \"leaderPriorities\": The priority values of the leader used in the Best\nMaster Clock Algorithm (BMCA). Lower values indicate higher priority\n- \"priority1\": Static priority set by the administrator\n- \"priority2\": Dynamic priority based on the leader's role or clock quality\n- \"leaderAccuracy\": The timing offset in seconds from the sample timestamp\nto the PTP timestamp\n- \"meanPathDelay\": The average round-trip delay between the device and the\nPTP leader, measured in seconds\nsource: The source of synchronization must be defined as one of the\nfollowing:\n- \"vlan\": Integer representing the VLAN ID for PTP traffic (e.g., 100 for\nVLAN 100)\n- \"leaderTimeSource\": Indicates the leader's source of time, such as GNSS, atomic\nclock, or NTP\n- \"genlock\": The tracking device has an external black/burst or\ntri-level analog sync signal that is triggering the capture of\ntracking samples\n- \"videoIn\": The tracking device has an external video signal that is\ntriggering the capture of tracking samples\n- \"ptp\": The tracking device is locked to a PTP leader\n- \"ntp\": The tracking device is locked to an NTP server\n"
},
"timecode": {
"type": "object",
"properties": {
"hours": {
"type": "integer",
"maximum": 23,
"minimum": 0
},
"minutes": {
"type": "integer",
"maximum": 59,
"minimum": 0
},
"seconds": {
"type": "integer",
"maximum": 59,
"minimum": 0
},
"frames": {
"type": "integer",
"maximum": 119,
"minimum": 0
},
"frameRate": {
"type": "object",
"properties": {
"num": {
"type": "integer",
"maximum": 2147483647,
"minimum": 1
},
"denom": {
"type": "integer",
"maximum": 4294967295,
"minimum": 1
}
},
"required": [
"num",
"denom"
],
"additionalProperties": false
},
"subFrame": {
"type": "integer",
"maximum": 4294967295,
"minimum": 0
},
"dropFrame": {
"type": "boolean"
}
},
"required": [
"hours",
"minutes",
"seconds",
"frames",
"frameRate"
],
"description": "SMPTE timecode of the sample. Timecode is a standard for labeling\nindividual frames of data in media systems and is useful for\ninter-frame synchronization. Frame rate is a rational number, allowing\ndrop frame rates such as that colloquially called 29.97 to be\nrepresented exactly, as 30000/1001. The timecode frame rate may differ\nfrom the sample frequency. The zero-based sub-frame field allows for finer\ndivision of the frame, e.g. interlaced frames have two sub-frames,\none per field.\n",
"additionalProperties": false
}
},
"additionalProperties": false
},
"lens": {
"type": "object",
"properties": {
"custom": {
"type": "array",
"items": {
"type": "number"
},
"description": "This list provides optional additional custom coefficients that can \nextend the existing lens model. The meaning of and how these characteristics\nare to be applied to a virtual camera would require negotiation between a\nparticular producer and consumer.\n"
},
"distortion": {
"type": "array",
"items": {
"type": "object",
"properties": {
"model": {
"type": "string",
"minLength": 1,
"maxLength": 1023
},
"radial": {
"type": "array",
"items": {
"type": "number"
},
"minItems": 1
},
"tangential": {
"type": "array",
"items": {
"type": "number"
},
"minItems": 1
},
"overscan": {
"type": "number",
"minimum": 1.0,
"description": "Overscan factor on lens [un]distortion. Overscan may be provided by the\nproducer but can also be overriden or calculated by the consumer. Note\nthis should be the maximum of both projection-matrix-based and field-of-\nview-based rendering as per the OpenLensIO documentation.\n"
}
},
"required": [
"radial"
],
"additionalProperties": false
},
"minItems": 1,
"description": "A list of Distortion objects that each define the coefficients for\ncalculating the distortion characteristics of a lens comprising radial\ndistortion coefficients of the spherical distortion (k1-N) and \n(optionally) the tangential distortion (p1-N). The key 'model'\nnames the distortion model. Typical values for 'model' include \n\"Brown-Conrady D-U\" when mapping distorted to undistorted coordinates,\nand \"Brown-Conrady U-D\" when mapping undistorted to undistorted\ncoordinates. If not provided, the default model is \"Brown-Conrady D-U\".\n"
},
"distortionOffset": {
"type": "object",
"properties": {
"x": {
"type": "number"
},
"y": {
"type": "number"
}
},
"required": [
"x",
"y"
],
"additionalProperties": false,
"description": "Offset in x and y of the centre of distortion of the virtual camera",
"units": "millimeter"
},
"encoders": {
"type": "object",
"properties": {
"focus": {
"type": "number",
"maximum": 1.0,
"minimum": 0.0
},
"iris": {
"type": "number",
"maximum": 1.0,
"minimum": 0.0
},
"zoom": {
"type": "number",
"maximum": 1.0,
"minimum": 0.0
}
},
"additionalProperties": false,
"description": "Normalised real numbers (0-1) for focus, iris and zoom.\nEncoders are represented in this way (as opposed to raw integer\nvalues) to ensure values remain independent of encoder resolution,\nminimum and maximum (at an acceptable loss of precision).\nThese values are only relevant in lenses with end-stops that\ndemarcate the 0 and 1 range.\nValue should be provided in the following directions (if known):\nFocus: 0=infinite 1=closest\nIris: 0=open 1=closed\nZoom: 0=wide angle 1=telephoto\n",
"anyOf": [
{
"required": [
"focus"
]
},
{
"required": [
"iris"
]
},
{
"required": [
"zoom"
]
}
]
},
"entrancePupilOffset": {
"type": "number",
"description": "Offset of the entrance pupil relative to the nominal imaging plane\n(positive if the entrance pupil is located on the side of the nominal\nimaging plane that is towards the object, and negative otherwise).\nMeasured in meters as in a render engine it is often applied in the\nvirtual camera's transform chain.\n",
"units": "meter"
},
"exposureFalloff": {
"type": "object",
"properties": {
"a1": {
"type": "number"
},
"a2": {
"type": "number"
},
"a3": {
"type": "number"
}
},
"required": [
"a1"
],
"additionalProperties": false,
"description": "Coefficients for calculating the exposure fall-off (vignetting) of\na lens\n"
},
"fStop": {
"type": "number",
"exclusiveMinimum": 0.0,
"description": "The linear f-number of the lens, equal to the focal length divided\nby the diameter of the entrance pupil.\n"
},
"pinholeFocalLength": {
"type": "number",
"exclusiveMinimum": 0.0,
"description": "Distance between the pinhole and the image plane in the simple CGI pinhole camera model.",
"units": "millimeter"
},
"focusDistance": {
"type": "number",
"exclusiveMinimum": 0.0,
"description": "Focus distance/position of the lens",
"units": "meter"
},
"projectionOffset": {
"type": "object",
"properties": {
"x": {
"type": "number"
},
"y": {
"type": "number"
}
},
"required": [
"x",
"y"
],
"additionalProperties": false,
"description": "Offset in x and y of the centre of perspective projection of the\nvirtual camera\n",
"units": "millimeter"
},
"rawEncoders": {
"type": "object",
"properties": {
"focus": {
"type": "integer",
"maximum": 4294967295,
"minimum": 0
},
"iris": {
"type": "integer",
"maximum": 4294967295,
"minimum": 0
},
"zoom": {
"type": "integer",
"maximum": 4294967295,
"minimum": 0
}
},
"additionalProperties": false,
"description": "Raw encoder values for focus, iris and zoom.\nThese values are dependent on encoder resolution and before any\nhoming / ranging has taken place.\n",
"anyOf": [
{
"required": [
"focus"
]
},
{
"required": [
"iris"
]
},
{
"required": [
"zoom"
]
}
]
},
"tStop": {
"type": "number",
"exclusiveMinimum": 0.0,
"description": "Linear t-number of the lens, equal to the F-number of the lens\ndivided by the square root of the transmittance of the lens.\n"
}
},
"additionalProperties": false
},
"protocol": {
"type": "object",
"properties": {
"name": {
"type": "string",
"minLength": 1,
"maxLength": 1023
},
"version": {
"type": "array",
"items": {
"type": "integer",
"maximum": 9,
"minimum": 0
},
"minItems": 3,
"maxItems": 3
}
},
"required": [
"name",
"version"
],
"additionalProperties": false,
"description": "Name of the protocol in which the sample is being employed, and\nversion of that protocol\n"
},
"sampleId": {
"type": "string",
"pattern": "^urn:uuid:[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$",
"description": "URN serving as unique identifier of the sample in which data is\nbeing transported.\n"
},
"sourceId": {
"type": "string",
"pattern": "^urn:uuid:[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$",
"description": "URN serving as unique identifier of the source from which data is\nbeing transported.\n"
},
"sourceNumber": {
"type": "integer",
"maximum": 4294967295,
"minimum": 0,
"description": "Number that identifies the index of the stream from a source from which\ndata is being transported. This is most important in the case where a source\nis producing multiple streams of samples.\n"
},
"relatedSampleIds": {
"type": "array",
"items": {
"type": "string",
"pattern": "^urn:uuid:[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$"
},
"description": "List of sampleId properties of samples related to this sample. The\nexistence of a sample with a given sampleId is not guaranteed.\n"
},
"globalStage": {
"type": "object",
"properties": {
"E": {
"type": "number"
},
"N": {
"type": "number"
},
"U": {
"type": "number"
},
"lat0": {
"type": "number"
},
"lon0": {
"type": "number"
},
"h0": {
"type": "number"
}
},
"required": [
"E",
"N",
"U",
"lat0",
"lon0",
"h0"
],
"description": "Position of stage origin in global ENU and geodetic coordinates\n(E, N, U, lat0, lon0, h0). Note this may be dynamic if the stage is\ninside a moving vehicle.\n",
"additionalProperties": false,
"units": "meter"
},
"transforms": {
"type": "array",
"items": {
"type": "object",
"properties": {
"translation": {
"type": "object",
"properties": {
"x": {
"type": "number"
},
"y": {
"type": "number"
},
"z": {
"type": "number"
}
},
"additionalProperties": false,
"units": "meter"
},
"rotation": {
"type": "object",
"properties": {
"pan": {
"type": "number"
},
"tilt": {
"type": "number"
},
"roll": {
"type": "number"
}
},
"additionalProperties": false,
"units": "degree"
},
"scale": {
"type": "object",
"properties": {
"x": {
"type": "number"
},
"y": {
"type": "number"
},
"z": {
"type": "number"
}
},
"additionalProperties": false
},
"id": {
"type": "string",
"minLength": 1,
"maxLength": 1023
}
},
"required": [
"translation",
"rotation"
],
"additionalProperties": false
},
"minItems": 1,
"description": "A list of transforms.\nTransforms are composed in sequential order, starting with the first\ntransform in the list and concluding with the last transform in the list.\nThe compound transform contains the position (in meters) and orientation\n(in degrees) of the camera sensor relative to stage origin.\nThe Z axis points upwards and the coordinate system is right-handed.\nY points in the forward camera direction (when pan, tilt and roll are\nzero).\nFor example in an LED volume Y would point towards the centre of the\nLED wall and so X would point to camera-right.\nRotation expressed as euler angles in degrees of the camera sensor\nrelative to stage origin\nRotations are intrinsic and are measured around the axes ZXY, commonly\nreferred to as [pan, tilt, roll]\nNotes on Euler angles:\nEuler angles are human readable and unlike quarternions, provide the\nability for cycles (with angles >360 or <0 degrees).\nWhere a tracking system is providing the pose of a virtual camera,\ngimbal lock does not present the physical challenges of a robotic\nsystem.\nConversion to and from quarternions is trivial with an acceptable loss\nof precision.\n",
"units": "meter / degree",
"uniqueItems": false
}
}
}
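A received sample can be checked against this schema with any JSON Schema 2020-12 validator. A minimal sketch using the Python jsonschema package follows; the file names are placeholders.

import json

from jsonschema import Draft202012Validator

# Placeholder file names for the schema published above and a received sample.
with open("opentrackio_schema.json") as schema_file:
    schema = json.load(schema_file)
with open("sample.json") as sample_file:
    sample = json.load(sample_file)

# Unknown keywords such as "units" are ignored by JSON Schema validators.
errors = list(Draft202012Validator(schema).iter_errors(sample))
for error in errors:
    location = "/".join(str(part) for part in error.path) or "(root)"
    print(location, "->", error.message)
if not errors:
    print("sample conforms to the OpenTrackIO schema")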