Guide to optical flow estimation from RGB image sequences.
Optical Flow Overview
Optical flow is a computer vision technique that estimates the motion of objects between consecutive frames in an image sequence. By analyzing pixel-level movement, the algorithm can compute motion vectors that represent how each pixel moves from one frame to the next.
The OpticalFlowAlgorithm class provides a high-level interface for optical flow computation, while the underlying OpticalFlow class handles the core flow estimation logic.
Optical Flow Usage
Basic Optical Flow Computation
The following example demonstrates basic usage of the OpticalFlowAlgorithm. We assume the input is a sequence of RGB images named imageSequence, represented as a std::unique_ptr&lt;SharedImageSet&gt;.
```cpp
#include <ImFusion/Base/DataList.h>
#include <ImFusion/Base/SharedImageSet.h>
#include <ImFusion/Vision/OpticalFlowAlgorithm.h>

// imageSequence is the input std::unique_ptr<SharedImageSet> holding the RGB frames.
// Note: the construction below is a sketch; check OpticalFlowAlgorithm.h for the exact signature.
OpticalFlowAlgorithm opticalFlowComputer(imageSequence.get());
opticalFlowComputer.p_opticalFlowName = "Optical Flow RAFT";
opticalFlowComputer.p_removeInvalidFlow = true;
opticalFlowComputer.compute();

// Extract the computed flow from the algorithm's output.
std::unique_ptr<SharedImageSet> opticalFlowResult = opticalFlowComputer.takeOutput().extractFirstImage();
```
Optical Flow with uncertainty
To compute optical flow along with a per-pixel flow uncertainty, select a model that predicts uncertainty and extract both outputs:
```cpp
#include <ImFusion/Base/DataList.h>
#include <ImFusion/Base/SharedImageSet.h>
#include <ImFusion/Vision/OpticalFlowAlgorithm.h>

// imageSequence is the input std::unique_ptr<SharedImageSet> holding the RGB frames.
// Note: the construction below is a sketch; check OpticalFlowAlgorithm.h for the exact signature.
OpticalFlowAlgorithm opticalFlowComputer(imageSequence.get());
opticalFlowComputer.p_opticalFlowName = "Optical Flow SEARAFT";
opticalFlowComputer.p_removeInvalidFlow = true;
opticalFlowComputer.compute();

// The algorithm returns both the flow and its per-pixel uncertainty.
std::vector<std::unique_ptr<SharedImageSet>> outputImages = opticalFlowComputer.takeOutput().extractAllImages();

std::unique_ptr<SharedImageSet> opticalFlowResult;
std::unique_ptr<SharedImageSet> uncertaintyResult;
for (auto& output : outputImages)
{
    if (output)
    {
        if (output->name() == "OpticalFlow")
            opticalFlowResult = std::move(output);
        else if (output->name() == "OpticalFlowUncertainty")
            uncertaintyResult = std::move(output);
    }
}
if (!opticalFlowResult)
    throw AlgorithmExecutionError("Could not extract optical flow shared image set.");
```
See also: OpticalFlowAlgorithm
Custom Models
OpticalFlow allows running your custom models through our machine learning engines. This is done by tracing or scripting your Torch model, or exporting your ONNX model, and linking it through a configuration file.
Here is an example configuration file in .yaml format for OpticalFlow:
```yaml
Version: '8.0'
Type: NeuralNetwork
Name: CustomModel
Description: Optical Flow
PredictionOutput: [Image]
Engine:
  Name: torch
  DisableJITRecompile: true
  ModelFile: <path_to_model_file>
  ForceCPU: false
Verbose: true
InputFields: [FirstImage, SecondImage]
OutputFields: [Flow]
PreprocessingInputFields: [FirstImage, SecondImage]
PreProcessing:
  - MakeFloat: {}
Sampling:
  - SkipUnpadding: true
```
InputFields and OutputFields must use exactly the names shown above. In the current release, only the Flow output is exposed to the user, and it must be returned as the first element by the model. Any additional outputs should still be listed in OutputFields and PredictionOutput with their return types, but they will be discarded.
For further information on the machine learning configuration files, please refer to the Machine Learning Model page in our user documentation.