No need to print or purchase special calibration targets
Use any flat LCD display as a dynamic high-precision target.
Modern displays localize viewing rays with an uncertainty down to tens of microns, which is 10 to 100 times better than the dimensional stability of typical static targets.
No need to write any auxiliary programs or download any software
All the heavy-duty data processing runs on our servers.
We use the most advanced algorithms originating from the field of optical metrology and leverage optimized GPU-based implementations.
Active target techniques (ATTs) ensure fundamentally better-quality data
All calibration methods rely on “landmarks”: points that are recognized in the images and whose positions in the 3D world are known. With static patterns, typical datasets contain at most a few dozen such points.
ATTs, developed for the most demanding measurements, turn every pixel on the screen into a “landmark”. Dense datasets produced by ATTs may contain millions of decoded points, which leads to a significantly more robust and reliable calibration.
Moreover, our decoding algorithms also estimate a position uncertainty for each pixel, which is essential for assessing the calibration quality.
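As an illustration only (this is a textbook sketch, not our production pipeline), the core idea behind active-target decoding can be shown with classic N-step phase shifting: the screen displays N sinusoidal fringe patterns shifted by 2π/N, each camera pixel's recovered phase encodes its screen correspondence, and the modulation amplitude serves as a per-pixel reliability score.

```python
import numpy as np

def decode_phase(images):
    """Recover per-pixel phase and modulation from N phase-shifted frames.

    images: array of shape (N, H, W), where frame k records
        I_k = A + B * cos(phi + 2*pi*k/N).
    Returns (phi, B): wrapped phase in (-pi, pi] and modulation amplitude.
    Low modulation flags unreliable pixels (glare, defocus, moire).
    """
    n = images.shape[0]
    deltas = 2 * np.pi * np.arange(n) / n
    # Synchronous detection: project the stack onto sine and cosine.
    s = np.tensordot(np.sin(deltas), images, axes=(0, 0))
    c = np.tensordot(np.cos(deltas), images, axes=(0, 0))
    phi = np.arctan2(-s, c)          # wrapped phase per pixel
    mod = 2.0 / n * np.hypot(s, c)   # modulation amplitude B per pixel
    return phi, mod
```

The wrapped phase must still be unwrapped and mapped to screen coordinates (typically with patterns of several spatial frequencies); the sketch shows only the per-pixel decoding step.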
Advanced analytics and quality assurance for calibration outcomes
Along with the model parameters, we generate exhaustive graphics and reports that characterize the consistency, reliability, and reproducibility of the calibration results.
The residual model uncertainty is projected into the 3D world and quantified in meters rather than in pixels: with that, one can easily decide whether the calibration quality is adequate for the intended use of the camera.
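As a back-of-the-envelope illustration (assuming a simple pinhole model, not our full reporting machinery), an image-plane residual maps to a lateral world-space uncertainty by similar triangles:

```python
def world_uncertainty(sigma_px, depth_m, focal_px):
    """Project an image-plane residual (pixels) to a lateral world
    uncertainty (meters) at a given working distance, using the
    pinhole relation sigma_world = sigma_px * Z / f.

    sigma_px: residual or localization uncertainty in pixels
    depth_m:  working distance Z in meters
    focal_px: focal length f expressed in pixels
    """
    return sigma_px * depth_m / focal_px
```

For example, a 0.1 px residual at a 2 m working distance with a 2000 px focal length corresponds to about 0.1 mm of lateral uncertainty, a number one can compare directly against the application's tolerance.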
Theoretical basis
Our methods are based on research conducted at the Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (IOSB), published in peer-reviewed journals, and presented at conferences.
A. Pak, S. Reichel, J. Burke. Machine-learning-inspired workflow for camera calibration, Sensors 22(18), 6804 (2022)
A. Pak. Metrological camera calibration paradigm, In: Proc. 3rd European Machine Vision Forum, Ed.: B. Jähne (2018)
A. Pak. The concept and implementation of smooth generic camera calibration, tm-Technisches Messen 83(1), 25–35 (2016)
Why do I need better camera calibration?
If you use a camera in any computer vision application, the calibration quality determines how accurately your computer perceives the real world. Moreover, if your camera settings or environmental conditions change, it makes sense to re-calibrate the cameras regularly. Typical calibration workflows are cumbersome and prone to various errors and biases; we aim to remove unnecessary hurdles and provide access to top-quality calibration to anyone with a browser.
Can I calibrate a smartphone camera?
Probably. Many smartphones can be configured to behave as regular cameras and will thus work just fine with our active-target method. However, some premium smartphones may, under the hood, try to “improve” every picture by artificially painting or “hallucinating” image details. This step may involve complex and undocumented algorithms that interfere with our decoding logic, leading to poor-quality datasets.
Can I calibrate a DSLR camera?
Yes, but remember that changing any lens settings (F-number, zoom, or focus) may invalidate the calibration results.
Can I use a Full-HD display for calibration?
Yes, a resolution of 1920×1080 pixels is sufficient for high-quality calibration. A more important property, however, is the monitor’s size: in all calibration images, the displayed pattern must occupy the entirety or a significant portion of the camera’s field of view, while the resulting images should remain free of excessive moiré effects. For some cameras, this is more easily achieved with high-density (4K or 8K) monitors.
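To check whether a given monitor can fill your camera’s field of view, simple pinhole geometry suffices. The helper below is a hypothetical illustration, not part of our service:

```python
import math

def required_screen_width(distance_m, hfov_deg):
    """Minimum screen width (meters) needed to fill a camera's
    horizontal field of view at a given distance, from the pinhole
    relation W = 2 * Z * tan(hfov / 2).

    distance_m: camera-to-screen distance Z in meters
    hfov_deg:   horizontal field of view in degrees
    """
    return 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)
```

For example, a camera with a 60° horizontal field of view placed 0.5 m from the screen needs a display roughly 0.58 m wide; moving the camera closer relaxes the size requirement but may worsen focus and moiré.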
Can I use a curved monitor?
The current version of our software supports only flat rectangular screens. However, solutions using curved gaming screens are known and could be implemented in our framework if we see sufficient demand for this feature.
How do I cite the algorithm?
If you want to use our calibration service in your scientific work, please cite the following publication: A. Pak, S. Reichel, J. Burke. Machine-learning-inspired workflow for camera calibration, Sensors 22(18), 6804 (2022).