ImFusion C++ SDK 4.4.0
Camera Calibration

Comprehensive guide to Camera Calibration.

This page provides detailed information and code examples for calibrating a pinhole camera using CameraCalibration and its related classes. Calibration is usually the first essential step for many computer vision applications such as 3D reconstruction, tracking, and augmented reality.

Camera Calibration Overview

A pinhole camera model is a simplified mathematical representation of how a real camera projects 3D points in the world onto a 2D image plane. Camera calibration is the process of estimating the parameters of this model so that we can map between 3D world coordinates and 2D image coordinates accurately. It is a fundamental step in computer vision tasks such as 3D reconstruction, augmented reality, and robotic navigation.

Pinhole Model Key Parameters

  1. Intrinsic parameters: describe the internal geometry of the camera
    • Focal length (in pixel units along the x and y axes)
    • Principal point (the optical center of the image)
    • Skew coefficient (relates to non-orthogonality of the pixel axes; typically zero in modern cameras)
  2. Extrinsic parameters: define the camera's position and orientation in the world
    • Rotation matrix
    • Translation vector

Camera Calibration Workflow

  1. Capture multiple images of a known calibration target (commonly a checkerboard or ChArUco board; see Marker Detection) from different viewpoints.
  2. Detect feature points (e.g., corners of the checkerboard; see Marker Detection).
  3. Solve a system of equations to estimate the key parameters of the pinhole model using techniques such as Zhang's method or bundle adjustment.
  4. Refine the results
    • by minimizing the reprojection error between the observed image points and the reprojected model points,
    • or by minimizing the standard deviation of the estimated parameters.

In our algorithm, we also introduce an approach to automatically select a subset of input images when their total number is large. This reduces computation time without compromising accuracy.

Camera Calibration in ImFusion

In the ImFusion SDK, a camera calibration consists of three parts:

  1. CameraCalibrationDataComponent - stores the calibration results, including evaluation information.
  2. CameraCalibrationSettings - stores the constraints and parameters for the calibration.
  3. CameraCalibrationAlgorithm - conducts the calibration.

CameraCalibrationDataComponent

The CameraCalibrationDataComponent consists of three parts:

  • image size
  • intrinsic, extrinsic and distortion parameters
  • evaluation information: reprojection error and the standard deviations estimated for the intrinsic parameters and for the camera distortion

It also provides load/save functions to store calibration results and reuse them; both XML and JSON files are supported. The following example demonstrates how to use CameraCalibrationDataComponent:
#include <ImFusion/Base/ImFusionFile.h>
#include <ImFusion/Base/SharedImageSet.h>
#include <ImFusion/Core/Filesystem/Path.h>
#include <ImFusion/Vision/CameraCalibrationDataComponent.h>
#include <ImFusion/Vision/CameraCalibrationAlgorithm.h>
using namespace ImFusion;
// Input image with calibration board
ImFusionFile file("charucoboard.imf");
std::optional<OwningDataList> dataList = file.load();
if (!dataList.has_value())
	return;
std::unique_ptr<SharedImageSet> img = dataList->extractFirstImage();
// Run the calibration with a configured CameraCalibrationAlgorithm `alg`;
// its setup is explained in detail in the next section.
alg.compute();
// Always check whether the result is valid before using it.
CameraCalibrationDataComponent* calibration = img->components().get<CameraCalibrationDataComponent>();
if (calibration)
	calibration->save(Filesystem::Path("calibration.xml"));

Camera Calibration Algorithm

A good camera calibration usually requires only about 30 high-quality images with calibration points evenly distributed. In our algorithm, these images can be automatically selected, which reduces computation time while ensuring the quality of the results. The following example demonstrates how to use our calibration algorithm:

#include <ImFusion/Base/ImFusionFile.h>
#include <ImFusion/Base/SharedImageSet.h>
#include <ImFusion/Vision/CameraCalibrationDataComponent.h>
#include <ImFusion/Vision/CameraCalibrationAlgorithm.h>
using namespace ImFusion;
// Input image set with a calibration board, loaded as `img` (a SharedImageSet)
// as shown in the previous example; please refer to Marker Detection for more details.
// Create a marker configuration for a ChArUco board.
MarkerConfiguration config;
config.setMarkerType(MarkerConfiguration::CharucoBoard);
// Configure the ChArUco board parameters.
CharucoBoardInfo charucoParams;
charucoParams.gridSize = vec2i(9, 7);  // 9x7 grid
charucoParams.cellSize = 30.0;         // 30 mm cell size
charucoParams.markerSize = 10.0;       // 10 mm marker size
charucoParams.dictionary = 0;          // ArUco dictionary
charucoParams.minAdjacentMarkers = 2;  // minimum adjacent markers for corner detection
config.setCharucoBoardParameters(charucoParams);
// Configure the calibration settings.
CameraCalibrationSettings settings;
settings.setFixPrincipalPoint(true);
settings.setFixAspectRatio(true);
settings.setMinDetections(10);             // 10 points per image are usually needed
settings.setRecalibrateWithInliers(true);  // refine the result with inlier images
settings.setUseAutoSelection(true);        // recommended, especially with large datasets
settings.setMaxSelection(50);              // 50 images are enough for a good calibration result
// `alg` is a CameraCalibrationAlgorithm created on the input image set.
alg.setMarkerConfig(config);
alg.setCalibrationSettings(settings);
// If the image set already contains a calibration and only the reprojection
// error is needed, call setOnlyComputeMRE(true) instead of a full calibration.
alg.compute();
if (alg.status() != Algorithm::Status::Success)
	return;
auto result = img->components().get<CameraCalibrationDataComponent>();
if (!result)  // invalid result
	return;
// Get the extrinsic parameters for each image.
std::vector<mat4> cameraPoses = alg.cameraPoses();

Camera Calibration Best Practices

Marker Type Selection Guidelines

Choose the appropriate marker type for your application:

  • Chessboard: Simple, reliable
  • Charuco Board: Robust, handles partial occlusion
  • Circle Board: Rotation invariant, good for industrial applications

Image Quality

Use high-quality images:

  • Typically, 20–30 well-captured images are sufficient.
  • More images can improve robustness, but beyond a point, quality matters more than quantity; redundancy increases computation time without improving accuracy.
  • Use sharp images with minimal motion blur.
  • Ensure the calibration pattern is clearly visible and in focus.
  • The calibration pattern should cover at least half of the image.

Working Distance and Positions

  • The camera should be calibrated at approximately its working distance.
  • Distribute calibration points evenly across the entire image frame.
  • Capture images with the calibration target at different orientations, tilts, and distances.
  • Avoid using images where points cluster in one region of the image.

Camera Calibration Troubleshooting

Common Issues and Solutions

Please refer to Marker Detection for more details.

No Detection

  • Symptoms: No markers detected despite being visible
  • Solutions:
    • Check marker configuration parameters
    • Verify marker size and grid size
    • Adjust detector parameters for lighting conditions
    • Ensure good contrast between marker and background

Too Few Detected Points

  • Symptoms: Invalid results; a common cause is too few detected points in the image.
  • Solutions:
    • Increase the minDetections parameter
    • Use a more suitable calibration board
See also
CameraCalibrationDataComponent, CameraCalibrationSettings, CameraCalibrationAlgorithm