Navigation
The Navigation module provides tools to process real-time tracking data (referred to as tracking streams) and use this information for visualization and guidance in surgical or diagnostic scenarios.
Tracking data can be sourced from optical or electromagnetic systems, robotic devices, or other sensors that provide real-time positional data. For an overview of natively supported tracking hardware, see the Tracking Plugins section.
This module includes various utilities for combining and smoothing tracking poses, and displaying tools in 3D scenes. It also supports navigation workflows that guide a tool or object toward a target point along a predefined trajectory.
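To illustrate the kind of pose smoothing described above, here is a minimal, language-neutral Python sketch. All names (`PoseSmoother`, `slerp`) are hypothetical and do not correspond to the module's actual API; it simply shows one common approach: exponential smoothing, with linear interpolation for translation and spherical linear interpolation (slerp) for orientation quaternions.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    # Flip one quaternion if needed so we interpolate along the shorter arc.
    if dot < 0.0:
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:  # nearly parallel: fall back to normalized lerp
        q = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in q))
        return tuple(c / n for c in q)
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

class PoseSmoother:
    """Exponential smoothing for (position, orientation) tracking samples.

    alpha in (0, 1]: a higher alpha follows the raw stream more closely,
    a lower alpha suppresses jitter at the cost of added latency.
    (Hypothetical helper for illustration, not part of the module's API.)
    """
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.position = None   # (x, y, z)
        self.rotation = None   # unit quaternion (w, x, y, z)

    def update(self, position, rotation):
        if self.position is None:
            # First sample: initialize the filter state directly.
            self.position, self.rotation = position, rotation
        else:
            self.position = tuple(
                (1 - self.alpha) * p + self.alpha * q
                for p, q in zip(self.position, position))
            self.rotation = slerp(self.rotation, rotation, self.alpha)
        return self.position, self.rotation
```

A real tracking stream would call `update()` once per incoming sample; the choice of `alpha` trades responsiveness against stability, which is the same trade-off any smoothing utility for surgical navigation has to expose.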
Key features of the Navigation module include:
* Smoothing and stabilizing noisy tracking input
* Combining tracking tools with reference datasets such as MR or CT scans in a unified scene
  * Interactive calibration of tools
  * Definition of reference coordinate systems using tracker-attached objects to maintain scene consistency upon movement
  * Landmark registration by identifying coordinates with a tracked pointer to bring all objects into one scene
* Real-time 3D visualization of custom objects and synchronized updates of MPR views to provide spatial context
* Interactive navigation to a target location via a predefined path
  * Trajectory planning and target point definition
  * Visual feedback during navigation in 3D environments
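The landmark registration feature listed above typically reduces to a rigid point-set fit: given the same anatomical landmarks identified once with the tracked pointer (tracker space) and once in the reference dataset (image space), find the rotation and translation relating the two. The sketch below is a generic illustration of that computation using the Kabsch/SVD method with NumPy; `register_landmarks` is a hypothetical name, not the module's API, and the module may use a different registration algorithm internally.

```python
import numpy as np

def register_landmarks(tracker_pts, image_pts):
    """Rigid least-squares fit (Kabsch method) mapping tracker-space
    landmarks onto image-space landmarks: image ~= R @ tracker + t.
    Requires at least three non-collinear corresponding points.
    (Hypothetical helper for illustration, not the module's API.)
    """
    P = np.asarray(tracker_pts, dtype=float)
    Q = np.asarray(image_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Once `R` and `t` are known, every tracked pose can be transformed into image space, which is what brings the tools, the reference scan, and the MPR views into one consistent scene.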
Tracking Plugins
The following plugins provide integration with supported tracking systems:
NDI Tracking Plugin: Aurora (electromagnetic) and Polaris Vega/Lyra (optical) tracking systems
Atracsys Tracking Plugin: fusionTrack and spryTrack optical systems
OptiTrack Tracking Plugin: OptiTrack tracking-bar cameras, e.g. the Duo3 and Trio3 optical tracking systems