Gaze Tracker sample project
The Gaze Tracker sample uses the VisageSDK API to implement a simple application demonstrating screen-space gaze tracking.
Installing and running the project
The project must be installed on a web server in order to run (it cannot run from a local disk). For a quick trial from the Visage Technologies web server, click
here and choose Gaze Tracker.
To install the application on your own web server, upload the lib and Samples folders to the server. To run the application, open the Samples/GazeTracker/gazeTracker.html page in a web browser.
The application must be licensed in order to function. Licensing is based on License Key Files. To obtain the License Key File:
- Contact your Visage Technologies contact person in order to obtain a license key.
- Copy the obtained license key file into your application folder (e.g. Samples/GazeTracker).
- Edit GazeTracker/GazeTracker.html. Replace "dev_html5.vlc" in the licenseName and licenseURL variables at the bottom of the file with the name of your license key file.
- NOTE: The License Key File must be preloaded before registration by calling the VisageModule.FS_createPreloadedFile() function.
An example of how this is done in the sample:
VisageModule.FS_createPreloadedFile('/', licenseName, licenseURL, true, false, function(){ }, function(){ alert("Loading License Failed!") });
For further information please read licensing section of the documentation.
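The preload call shown above can be wrapped in a small helper so the file name, URL and error handling live in one place. Only the VisageModule.FS_createPreloadedFile() call itself comes from the sample; the helper name and its parameters are illustrative:

```javascript
// Hypothetical helper wrapping the license preload shown above.
// `module` is expected to expose Emscripten's FS_createPreloadedFile().
function preloadLicense(module, licenseName, licenseURL, onLoaded, onError) {
  // Arguments: parent directory, file name, URL,
  // canRead flag, canWrite flag, onload callback, onerror callback.
  module.FS_createPreloadedFile('/', licenseName, licenseURL, true, false,
    onLoaded,
    onError || function () { alert("Loading License Failed!"); });
}
```

In the sample this would be called with VisageModule once the module is ready, before any tracker registration takes place.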
Using the sample application
- Running the sample requires a recent browser.
- The SDK needs to download its data files before the application can be used.
- Allow the application to access your camera by choosing "Allow" in the browser's pop-up bar.
- Look straight at the camera. Tracking will start.
- Once the tracker has initialized you can start the calibration phase of screen space gaze estimation.
- In the calibration phase, the application displays a calibration point as a red dot. Clicking on the dot moves it to the next calibration position.
- After all calibration points have been clicked, the gaze tracking system is calibrated and the estimation phase starts. Calibration can optionally be continued to further improve the results.
- In the estimation phase, the application draws the estimated gaze location as a blue dot on the screen.
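The click-driven calibration flow above can be sketched as a small state machine. All names here are hypothetical, not the sample's actual API; the comment marks where the application would feed each clicked point into the SDK's gaze calibration (an assumption on my part):

```javascript
// Minimal sketch of the calibration flow, assuming a fixed list of
// screen positions for the red dot. All identifiers are illustrative.
class CalibrationSession {
  constructor(points) {
    this.points = points; // screen positions where the red dot is shown
    this.index = 0;       // index of the next point to be clicked
  }
  // Position where the red dot should be drawn, or null once done.
  currentPoint() {
    return this.index < this.points.length ? this.points[this.index] : null;
  }
  // Called when the user clicks the dot; advances to the next point.
  // In the real sample the clicked position would also be passed to the
  // SDK's gaze calibration here (an assumed call, not shown).
  click() {
    if (this.index < this.points.length) this.index += 1;
    return this.isCalibrated();
  }
  // True once every point has been clicked and estimation can start.
  isCalibrated() {
    return this.index >= this.points.length;
  }
}
```

Once isCalibrated() returns true, the application would switch to drawing the estimated gaze location (the blue dot) instead of the calibration dot.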
Implementation overview
Gaze Tracker is implemented in HTML and JavaScript, using the VisageSDK API for face tracking and screen-space gaze estimation, the getUserMedia API to access the camera, and native canvas methods for rendering.
Face tracking results are updated using the requestAnimationFrame API.
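A minimal sketch of such an update loop, with the per-frame tracking and drawing work injected as a callback. The function names are illustrative; requestAnimationFrame is the only browser API assumed, and it is injectable here so the loop can also run outside a browser:

```javascript
// Sketch of a requestAnimationFrame-driven update loop. `raf` is the
// scheduler (window.requestAnimationFrame in the sample); `step` performs
// one pass: track the current video frame, then redraw the canvas.
function startTrackingLoop(raf, step) {
  let running = true;
  function frame() {
    if (!running) return;
    step();     // one track + draw pass
    raf(frame); // schedule the next pass
  }
  raf(frame);
  return function stop() { running = false; };
}
```

In the browser this would be started as startTrackingLoop(window.requestAnimationFrame.bind(window), trackAndDraw), where trackAndDraw is whatever function performs one tracking and rendering pass.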
Browser compatibility
The Gaze Tracker sample has been tested on recent browsers on the Windows, Mac, Android and iPhone platforms.
The results are shown in the following table (NOTE: the application is not adapted to mobile devices).
|    |    |    |    |    |    |
|----|----|----|----|----|----|
|    | 96 | 91 | 81 | 88 | 14 |
|    | 95 | 93 | 64 | -- | -- |
|    | -- | -- | -- | -- | 14 |