Comprehensive guide to Camera Calibration.
This page provides detailed information and code examples for calibrating a pinhole camera using CameraCalibrationAlgorithm and its related classes. Calibration is usually the first essential step for many computer vision applications such as 3D reconstruction, tracking, and augmented reality.
Camera Calibration Overview
A pinhole camera model is a simplified mathematical representation of how a real camera projects 3D points in the world onto a 2D image plane. Camera calibration is the process of estimating the parameters of this model so that we can map between 3D world coordinates and 2D image coordinates accurately. It is a fundamental step in computer vision tasks such as 3D reconstruction, augmented reality, and robotic navigation.
Pinhole Model Key Parameters
- Intrinsic parameters: describe the internal geometry of the camera
  - Focal length (in pixel units along the x and y axes)
  - Principal point (the optical center of the image)
  - Skew coefficient (relates to non-orthogonality of the pixel axes; typically zero for modern cameras)
- Extrinsic parameters: define the camera's position and orientation in the world
  - Rotation matrix
  - Translation vector
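The intrinsic parameters above combine into the familiar projection equation. The following self-contained sketch (plain C++, not the ImFusion API; all names are illustrative) projects a point already expressed in camera coordinates onto the image plane:

```cpp
// Illustrative pinhole intrinsics: focal lengths fx/fy (in pixels),
// principal point (cx, cy), and skew coefficient s (usually 0).
struct Intrinsics
{
    double fx, fy, cx, cy, s;
};

// Project a 3D point p given in camera coordinates (extrinsics already
// applied) to pixel coordinates uv:
//   u = fx * (X/Z) + s * (Y/Z) + cx
//   v = fy * (Y/Z) + cy
void project(const Intrinsics& K, const double p[3], double uv[2])
{
    const double x = p[0] / p[2];   // normalized image coordinates
    const double y = p[1] / p[2];
    uv[0] = K.fx * x + K.s * y + K.cx;
    uv[1] = K.fy * y + K.cy;
}
```

For example, with fx = fy = 800, principal point (320, 240), and the camera-frame point (0.5, -0.25, 2.0), the projection lands at pixel (520, 140). A world-frame point would first be transformed by the extrinsic rotation and translation.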
Camera Calibration Workflow
- Capture multiple images of a known calibration target (commonly a checkerboard or a ChArUco board, see Marker Detection) from different viewpoints.
- Detect feature points (e.g., the checkerboard corners, see Marker Detection).
- Solve a system of equations to estimate the key parameters of the pinhole model, using techniques such as Zhang's method or bundle adjustment.
- Refine the results
  - by minimizing the reprojection error between the observed image points and the reprojected model points,
  - or by minimizing the standard deviations of the estimated parameters.

Our algorithm also introduces an approach to automatically select a subset of the input images when their total number is large. This reduces computation time without compromising accuracy.
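The refinement step minimizes the reprojection error; a common summary metric is the mean reprojection error (MRE), the average distance between detected points and the points reprojected through the estimated model. A minimal sketch (plain C++, not the ImFusion API):

```cpp
#include <array>
#include <cmath>
#include <cstddef>
#include <vector>

// Mean reprojection error: average Euclidean distance between observed
// 2D points and the corresponding reprojected model points.
double meanReprojectionError(const std::vector<std::array<double, 2>>& observed,
                             const std::vector<std::array<double, 2>>& reprojected)
{
    double sum = 0.0;
    for (std::size_t i = 0; i < observed.size(); ++i)
    {
        const double dx = observed[i][0] - reprojected[i][0];
        const double dy = observed[i][1] - reprojected[i][1];
        sum += std::sqrt(dx * dx + dy * dy);
    }
    return sum / static_cast<double>(observed.size());
}
```

Per-frame MREs of this kind can also be used to discard outlier images before re-running the calibration.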
Camera Calibration in ImFusion
In the ImFusion SDK, a camera calibration consists of three parts:
- CameraCalibrationDataComponent - stores the calibration results, including the evaluation
- CameraCalibrationSettings - stores the constraints and parameters for the calibration
- CameraCalibrationAlgorithm - conducts the calibration
CameraCalibrationDataComponent
The CameraCalibrationDataComponent consists of three parts:
- the image size
- the intrinsic, extrinsic, and distortion parameters
- evaluation information: the reprojection error and the standard deviations estimated for the intrinsic and distortion parameters

It also provides load/save functions to store calibration results and reuse them; XML and JSON files are supported. The following example demonstrates how to use CameraCalibrationDataComponent:
#include <ImFusion/Base/ImFusionFile.h>
#include <ImFusion/Base/SharedImageSet.h>
#include <ImFusion/Core/Filesystem/Path.h>
#include <ImFusion/Vision/CameraCalibrationAlgorithm.h>
#include <ImFusion/Vision/CameraCalibrationDataComponent.h>

// NOTE: the original sample was truncated; calls outside the surviving
// fragments (`alg.compute()`, the `calibration` check) are assumptions.
ImFusionFile file(Path("calibration.xml"));    // illustrative file name
if (!file.load())
    return;
// ... set up a CameraCalibrationAlgorithm `alg` on the loaded images ...
alg.compute();
// ... obtain the CameraCalibrationDataComponent `calibration` ...
if (calibration)
{
    // access the intrinsics, distortion and evaluation information,
    // or save them back via ImFusionFile (XML and JSON are supported)
}
Camera Calibration Algorithm
A good camera calibration usually requires only about 30 high-quality images with calibration points evenly distributed. In our algorithm, these images can be automatically selected, which reduces computation time while ensuring the quality of the results. The following example demonstrates how to use our calibration algorithm:
#include <ImFusion/Base/ImFusionFile.h>
#include <ImFusion/Base/SharedImageSet.h>
#include <ImFusion/Vision/CameraCalibrationAlgorithm.h>
#include <ImFusion/Vision/CameraCalibrationDataComponent.h>

// Describe the ChArUco calibration target.
CharucoBoardInfo charucoParams;
charucoParams.gridSize = vec2i(9, 7);      // number of cells
charucoParams.cellSize = 30.0;             // cell size (e.g. in mm)
charucoParams.markerSize = 10.0;           // marker size (e.g. in mm)
charucoParams.dictionary = 0;
charucoParams.minAdjacentMarkers = 2;

MarkerConfiguration config;
// config.setMarkerType(...);              // select the ChArUco marker type
config.setCharucoBoardParameters(charucoParams);

CameraCalibrationSettings settings;
// ... adjust the settings, e.g. settings.setUseAutoSelection(true) ...

// NOTE: the original sample was truncated; construction of `alg` on the
// input images and retrieval of `result` were lost and are assumptions.
alg.setMarkerConfig(config);
alg.setCalibrationSettings(settings);
alg.compute();
// ... check the algorithm status; on failure:
//     return;
if (!result)
    return;
The CameraCalibrationSettings class stores all parameters/settings for camera calibration. Its most relevant setters are:
- setUseAutoSelection(bool) - if true, a subset of images is automatically selected instead of using all images
- setMaxSelection(int) - maximum number of images to select when auto-selection is enabled
- setMinDetections(int) - minimum number of detected points per image
- setFixPrincipalPoint(bool) - the principal point remains unchanged during the calibration
- setFixAspectRatio(bool) - the ratio fx/fy stays the same as in the initial intrinsics
- setRecalibrateWithInliers(bool) - re-run the calibration using only images with a sufficiently small per-frame mean reprojection error (MRE)

The calibration target itself is described by a MarkerConfiguration (via setMarkerType() and setCharucoBoardParameters()), with a CharucoBoardInfo holding the board geometry when a ChArUco board is used.
Camera Calibration Best Practices
Marker Type Selection Guidelines
Choose the appropriate marker type for your application:
- Chessboard: Simple, reliable
- Charuco Board: Robust, handles partial occlusion
- Circle Board: Rotation invariant, good for industrial applications
Image Quality
Use High Quality Images
- Typically, 20–30 well-captured images are sufficient.
- More images can improve robustness, but beyond a point, quality matters more than quantity.
- Use sharp images with minimal motion blur
- Ensure calibration patterns are clearly visible and in focus
- Calibration patterns should cover at least half of the image
- Redundancy can increase computation without improving accuracy.
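One practical way to enforce the sharpness advice above is to pre-filter blurry frames with the variance of the Laplacian, a common focus measure: sharp images have strong second derivatives, blurred ones do not. A self-contained sketch on a raw grayscale buffer (not an ImFusion API):

```cpp
#include <cstddef>
#include <vector>

// Variance of the 4-neighbour Laplacian over the image interior.
// Higher values indicate sharper images; a constant (fully blurred)
// image yields 0. `img` is row-major grayscale of size w * h.
double laplacianVariance(const std::vector<double>& img, int w, int h)
{
    std::vector<double> lap;
    lap.reserve(static_cast<std::size_t>(w - 2) * (h - 2));
    for (int y = 1; y < h - 1; ++y)
        for (int x = 1; x < w - 1; ++x)
            lap.push_back(img[y * w + x - 1] + img[y * w + x + 1]
                          + img[(y - 1) * w + x] + img[(y + 1) * w + x]
                          - 4.0 * img[y * w + x]);

    double mean = 0.0;
    for (double v : lap)
        mean += v;
    mean /= static_cast<double>(lap.size());

    double var = 0.0;
    for (double v : lap)
        var += (v - mean) * (v - mean);
    return var / static_cast<double>(lap.size());
}
```

Because the score is scene-dependent, treat it as a relative measure: images scoring well below the batch median are candidates for removal.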
Working Distance and Positions
- Calibrate the camera at approximately the working distance at which it will later operate
- Distribute calibration points evenly across the entire image frame.
- Capture images with the calibration target at different orientations, tilts, and distances.
- Avoid using images where points cluster in one region of the image.
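The even-distribution advice above can be checked quantitatively with a simple heuristic: bin all detected points into a coarse grid over the image and look at the fraction of occupied cells (an illustrative helper, not part of the ImFusion SDK):

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Fraction of cells in a gridW x gridH grid over an imageW x imageH image
// that contain at least one detected calibration point.
double coverageFraction(const std::vector<std::array<double, 2>>& points,
                        int imageW, int imageH, int gridW, int gridH)
{
    std::vector<bool> occupied(static_cast<std::size_t>(gridW) * gridH, false);
    for (const auto& p : points)
    {
        if (p[0] < 0.0 || p[1] < 0.0 || p[0] >= imageW || p[1] >= imageH)
            continue;   // ignore points outside the image
        const int gx = static_cast<int>(p[0]) * gridW / imageW;
        const int gy = static_cast<int>(p[1]) * gridH / imageH;
        occupied[static_cast<std::size_t>(gy) * gridW + gx] = true;
    }
    int count = 0;
    for (bool c : occupied)
        count += c ? 1 : 0;
    return static_cast<double>(count) / (gridW * gridH);
}
```

Aggregated over all accepted frames, a low fraction flags data sets whose points cluster in one region of the image.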
Camera Calibration Troubleshooting
Common Issues and Solutions
Please refer to Marker Detection for more details.
No Detection
- Symptoms: No markers detected despite being visible
- Solutions:
- Check marker configuration parameters
- Verify marker size and grid size
- Adjust detector parameters for lighting conditions
- Ensure good contrast between marker and background
Too Few Detected Points
- Symptoms: Invalid calibration results; a common cause is having too few detected points per image
- Solutions:
- Increase the minDetections parameter so that images with too few points are discarded
- Use a more suitable calibration board
See also
- CameraCalibrationDataComponent, CameraCalibrationSettings, CameraCalibrationAlgorithm