
Visage Webcam Tracker
Category: Eye Trackers
AddIn: Visage Technologies
Scope: Global
Code Snippets: no
Supports Material List: no
Status Screen Widgets: no
License: Requires license from Visage Technologies
The Visage Webcam Tracker enables eye-tracking using standard webcams, making advanced gaze analysis accessible for research. It provides real-time estimates of gaze direction, fixation, and pupil position, supporting experiments in psychology, neuroscience, and human–computer interaction.
Description
The Visage Webcam Tracker element enables eye-tracking and pupil measurements using a standard webcam. Powered by the Visage Technologies SDK, the element applies advanced face analysis algorithms to estimate gaze direction, pupil positions, and fixation points without the need for specialized eye-tracking hardware.
With built-in calibration and filtering options, the tracker can achieve reliable accuracy for experiments in psychology, neuroscience, usability research, and HCI studies. The element supports GLM and affine calibration models, drift correction during runtime, and perspective adjustments to account for camera and screen geometry.
The Visage Webcam Tracker also provides real-time tracker samples, gaze coordinates, and pupil data, which can be logged and synchronized with other experimental signals. This makes it highly suitable for studies where affordable, camera-based eye tracking is preferred.
Key Features
Webcam-only eye tracking – no special hardware required, works with standard camera devices.
GLM and affine calibration for accurate mapping of gaze positions to screen coordinates.
Drift correction during runtime to maintain accuracy in long sessions.
Perspective correction compensates for camera angle and viewer position.
Real-time gaze and pupil data available for stimulus control and logging.
Customizable logging with flexible output formats and user-defined fields.
Noise reduction via smoothing filters for stable gaze signals.
Supports multiple faces, with selection via the Face Index property.
Compatible with EventIDE’s logging and analysis pipeline, enabling synchronization with stimuli, responses, and biosignals.
Requires Visage Technologies license for activation and use.
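The two calibration models listed above reduce to simple coordinate transforms. The exact formulas used by the SDK are not documented here; the sketch below shows the standard form of each model, with all function and parameter names being illustrative only:

```python
import math

def glm_map(raw_x, raw_y, gain_x, gain_y, offset_x, offset_y):
    """GLM-style linear mapping: independent gain and offset per axis,
    mirroring the Gain X/Y and Offset X/Y properties."""
    return gain_x * raw_x + offset_x, gain_y * raw_y + offset_y

def affine_map(raw_x, raw_y, scale_x, scale_y, tx, ty, rotation_deg):
    """Affine alternative: scale, then rotate (cf. Affine Rotation),
    then translate."""
    a = math.radians(rotation_deg)
    sx, sy = scale_x * raw_x, scale_y * raw_y
    x = sx * math.cos(a) - sy * math.sin(a) + tx
    y = sx * math.sin(a) + sy * math.cos(a) + ty
    return x, y
```

In practice the element computes and applies these coefficients itself when Run GLM Calibration is executed; the sketch only shows what the resulting mapping looks like.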
Properties
Name | Description | Property Class | Type | On runtime change |
Visage Settings | | | | |
License File | Defines the location of the Visage Technologies license file required for all Visage elements | Design | String | |
Analysis Rate | Defines a rate (Hz) of the analysis performed by the Visage SDK. Shared among all Visage elements | Design | Double | |
Max Faces | Defines the maximum number of tracked faces | Design | Int32 |
Face Index | Index of the face selected for eye-tracking when multiple persons are recorded. Zero = first recognized face | General | Int32 | |
Angular Units | Defines units for angular tracking data | Design | Int32 | |
Camera Settings | | | | |
Camera | Selected camera device | Design | String | |
Frame Size | Pixel resolution of captured webcam frames. Must be supported by the selected camera | General | clSize |
Tracker Samples | | | | |
Newest Samples | Array of eye-tracker samples taken in the last iteration of the control loop | Status | clVisa… | |
Newest Sample | The newest eye-tracker sample taken in the last iteration of the control loop | Status | clVisa… | |
GLM Calibration | | | | |
Run GLM Calibration | Runs the GLM calibration procedure, automatically applying coefficients at runtime | Design | Boolean | |
Gain X | Gain coefficient for X axis in GLM calibration | General | Double | |
Gain Y | Gain coefficient for Y axis in GLM calibration | General | Double | |
Offset X | Offset coefficient for X axis in GLM calibration | General | Double | |
Offset Y | Offset coefficient for Y axis in GLM calibration | General | Double | |
Affine Mode | Enables affine transformation (scale, translation, rotation) instead of GLM | General | Boolean | |
Affine Rotation | Rotation angle (deg) for affine calibration | General | Double | |
Save Calibration Now | Saves the current GLM calibration into an XML file | Runtime Command | String | |
Load Calibration Now | Loads GLM calibration values from an XML file | Runtime Command | String | |
Drift Correction | | | | |
Recalibrate Now | Runtime command that corrects drift by recalibrating the tracked gaze position to a custom screen point | Runtime Command | Boolean |
Recalibration Point | Screen point used in recalibration. Default = screen center | General | clPoint | |
Filters | | | | |
Smoothing Ratio | Ratio for FIR smoothing filter applied to gaze tracking | General | Double | |
Logging | | | | |
Open Log Designer | Opens designer window to define log format | Design | Boolean | |
Is Logging Now | Allows temporarily pausing tracker logging at runtime | General | Boolean |
Custom Field | User-defined data added to the log file | General | String | |
Data Report Label | String label appended to report file names | General | String | |
Runtime | | | | |
Radar Point | Newest calibrated tracking position at runtime | General | clPoint | |
Perspective Correction | | | | |
Perspective Correction | Enables correction for perspective distortions | General | Boolean | |
Configure Perspective | Opens setup for perspective calibration | Design | Boolean | |
Viewer Position | 3D position of the viewer for perspective correction | General | Point3D | |
Tracker Position | 3D position of the tracker for perspective correction | General | Point3D | |
Focal Length | Focal length parameter for perspective correction | General | Double | |
Control | | | | |
Is Enabled | If set to false the element is completely omitted when the experiment is run | Design | Boolean | |
Title | Title of the element | Design | String |
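The Smoothing Ratio property drives an FIR filter whose internal design is not documented here. As an illustration of FIR smoothing on a gaze stream, the sketch below uses a simple moving-average filter; letting the ratio determine the window length is an assumption for the example:

```python
from collections import deque

class FirSmoother:
    """Moving-average FIR filter over recent gaze samples.
    The window size standing in for a 'smoothing ratio' is an
    assumption; the element's actual filter may differ."""
    def __init__(self, window=5):
        self.buf = deque(maxlen=window)

    def feed(self, x, y):
        """Add a raw gaze sample and return the smoothed position."""
        self.buf.append((x, y))
        n = len(self.buf)
        return (sum(p[0] for p in self.buf) / n,
                sum(p[1] for p in self.buf) / n)
```

A longer window yields a steadier signal at the cost of lag, which is the same trade-off the Notes section warns about for large smoothing ratios.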
Practical Use
Technique 1: Webcam-based Eye Tracking with GLM Calibration
Add the Visage Webcam Tracker to your experiment and configure the Camera and Frame Size.
Run GLM Calibration to map eye-tracker input into screen coordinates.
Access Newest Sample or Radar Point during trials to obtain gaze positions.
Use gaze data to determine fixation points or to control stimulus presentation dynamically.
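Deriving fixation points from the gaze stream (the last step above) is left to the experiment designer. One common approach, not part of the element itself, is dispersion-based (I-DT style) fixation detection over the logged gaze coordinates:

```python
def detect_fixations(samples, max_dispersion=30.0, min_samples=6):
    """I-DT style detection: a run of consecutive (x, y) samples whose
    bounding-box dispersion (width + height) stays under max_dispersion
    and that lasts at least min_samples is reported as one fixation,
    returned as (centroid_x, centroid_y, sample_count)."""
    fixations, window = [], []
    for p in samples:
        window.append(p)
        xs = [q[0] for q in window]
        ys = [q[1] for q in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            # Dispersion exceeded: close the previous run if long enough.
            if len(window) - 1 >= min_samples:
                w = window[:-1]
                fixations.append((sum(q[0] for q in w) / len(w),
                                  sum(q[1] for q in w) / len(w), len(w)))
            window = [p]
    if len(window) >= min_samples:
        fixations.append((sum(q[0] for q in window) / len(window),
                          sum(q[1] for q in window) / len(window), len(window)))
    return fixations
```

The dispersion and duration thresholds are illustrative; appropriate values depend on the analysis rate and on the noise level of the webcam signal.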
Technique 2: Drift Correction During Long Experiments
Start with a standard calibration at the beginning of the experiment.
During runtime, call Recalibrate Now when drift is observed, aligning gaze to a known Recalibration Point.
Continue data collection with adjusted offsets, ensuring accuracy over extended tasks.
Save calibration with Save Calibration Now for reuse across sessions.
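The recalibration step above effectively shifts the calibration offsets so that the gaze currently being measured lands on the known Recalibration Point. The element performs this internally when Recalibrate Now is issued; a hypothetical sketch of the underlying arithmetic:

```python
def drift_correct(offset_x, offset_y, measured, target):
    """Shift existing GLM-style offsets so that the gaze currently
    measured at `measured` maps onto the known recalibration `target`.
    Function and parameter names are illustrative, not the element's API."""
    dx = target[0] - measured[0]
    dy = target[1] - measured[1]
    return offset_x + dx, offset_y + dy
```
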
Notes
A valid Visage Technologies license file is required for activation; make sure it is properly linked to the element via the License File property.
Accuracy depends on camera resolution, frame rate, and lighting conditions.
Unlike infrared eye-trackers, webcam-based tracking may be more sensitive to head movement and illumination changes.
Use Smoothing Ratio to reduce noise in gaze data, but avoid excessive smoothing, which reduces responsiveness.
For high-precision applications, regular drift correction is recommended.
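The exact geometry applied by Perspective Correction is not documented here, but the Viewer Position and Focal Length parameters matter because the same angular gaze error projects to a larger on-screen offset the farther the viewer sits from the display. A hedged illustration of that relationship:

```python
import math

def angle_to_screen_offset(gaze_angle_deg, viewer_distance_mm, px_per_mm):
    """Convert an angular gaze deviation into an on-screen pixel offset
    for a viewer at the given distance (simple pinhole approximation;
    an illustration, not the element's actual correction formula)."""
    return math.tan(math.radians(gaze_angle_deg)) * viewer_distance_mm * px_per_mm
```

For example, a 1-degree error at a 600 mm viewing distance already corresponds to roughly 10 mm on screen, which is why accurate viewer and tracker positions improve the correction.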
