Radiant metrics

A better way to calibrate cameras

No need to dance with checkerboard targets, no need to deal with mysterious throwaway calibration scripts. You just need a computer with a flat screen and a bit of patience.

Calibration results can be used in OpenCV routines or in any application that adopts the same camera model.

No print

No need to print or purchase special calibration targets

Use any flat LCD display as a dynamic high-precision target.

Modern displays localize the camera's view rays with an uncertainty down to tens of microns, which is 10-100 times better than the dimensional stability of typical static targets.

No code

No need to write any auxiliary programs or download any software

All the heavy-duty data processing runs on our servers.

We use advanced algorithms originating from the field of optical metrology and leverage optimized GPU-based implementations.

Calibration

Active target techniques (ATTs) ensure fundamentally better-quality data

All calibration methods rely on “landmarks”: points that are recognized in the images and whose positions in the 3D world are known. With static patterns, typical datasets contain at most a few dozen such points.

ATTs, developed for the most demanding measurements, turn every pixel on the screen into a “landmark”. Dense datasets produced by ATTs may contain millions of decoded points, which leads to significantly more robust and reliable calibration.

Moreover, our decoding algorithms also estimate the position uncertainty of each decoded point, which is essential for assessing the calibration quality.


Advanced analytics and quality assurance for calibration outcomes

Along with the model parameters, we generate comprehensive plots and reports that characterize the consistency, reliability, and reproducibility of the calibration results.

The residual model uncertainty is projected into the 3D world and quantified in meters rather than pixels, so one can easily decide whether the calibration quality is adequate for the camera's intended use.
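As a back-of-the-envelope illustration of why metric units help, a pinhole-model conversion from a pixel residual to world-space uncertainty at a working distance (all values hypothetical):

```python
# Pinhole approximation: a lateral error of sigma_px pixels at distance Z
# corresponds to roughly sigma_px * Z / f_px in world units, where f_px is
# the focal length expressed in pixels.  All numbers below are made up.
sigma_px = 0.1       # residual uncertainty, pixels
focal_px = 1000.0    # focal length, pixels
distance_m = 2.0     # working distance, meters

sigma_world_m = sigma_px * distance_m / focal_px  # 0.0002 m, i.e. 0.2 mm
```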

Theoretical basis

Our methods are based on research conducted at the Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (IOSB), published in scientific journals and presented at conferences.

F.A.Q.

Why do cameras need calibration?

If you use a camera in any computer vision application, the calibration quality determines how accurately your computer perceives the real world. Moreover, if your camera settings or environmental conditions change, it makes sense to re-calibrate the cameras regularly. Typical calibration workflows are cumbersome and prone to various errors and biases; we aim to remove unnecessary hurdles and provide access to top-quality calibration to anyone with a browser.

Can I calibrate a smartphone camera?

Probably. Many smartphones can be configured to behave as regular cameras and will thus work just fine with our method of active targets. However, some premium smartphones may, under the hood, try to “improve” every picture by artificially painting in or “hallucinating” image details. This processing may involve complex and undocumented algorithms that interfere with our decoding logic, leading to poor-quality datasets.

Can I adjust the lens after calibration?

Yes, but remember that changing any lens settings (F-number, zoom, or focus) may invalidate the calibration results.

Is a regular Full HD monitor sufficient?

Yes, a resolution of 1920x1080 pixels is sufficient for high-quality calibration. A more important property, however, is the monitor’s size: in all calibration images, the displayed pattern must occupy the entirety or a significant portion of the camera’s field of view, while the resulting images should be free from excessive moiré effects. For some cameras, this is more easily achieved with high-density (4K or 8K) monitors.

Can I use a curved monitor?

In the current version of our software, we only support flat rectangular screens. However, solutions using curved gaming screens are known and could be implemented in our framework if we see sufficient demand for this feature.

How do I cite your service in scientific work?

If you use our calibration service in your scientific work, please cite the following publication: A. Pak, S. Reichel, and J. Burke, “Machine-learning-inspired workflow for camera calibration,” Sensors 22(18), 6804 (2022).

Still have questions? Send us an email.