Here is the problem we want to solve: a user is sitting in front of a computer display, looking at a point on it. A camera is pointed at the user's eye, and we want to use the camera image to find out where the user is looking. Several eye features can be detected and tracked in the eye image, such as the pupil center, the limbus (the border of the iris), and the eye corners. Many gaze trackers use the pupil center together with the reflection of a light source (from the anterior surface of the cornea) to estimate the gaze point, also called the point of regard (PoR).
One way of using these image features and relating them to the gaze point in space is to follow a geometrical method and find the person's gaze vector in space. Once we have the gaze vector relative to our world coordinate system, we can find the intersection of this vector with the planar screen in front of the user; this intersection is the gaze point. This method is a direct way of finding the gaze point in space, and it requires a calibrated setup and knowledge of the geometry of the eye model and the system components. If you are interested in the mathematical details of this method, read the paper General Theory of Remote Gaze Estimation Using the Pupil Center and Corneal Reflections.
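The final step of the geometric method, intersecting the gaze vector with the screen plane, is plain ray-plane intersection. Here is a minimal sketch of that step (function names and the coordinate conventions are my own, not from the paper; the hard part of the method, recovering the gaze vector itself, is assumed already done):

```python
import numpy as np

def gaze_point_on_screen(eye_center, gaze_dir, plane_point, plane_normal):
    """Intersect the gaze ray with the planar screen.

    eye_center:  3D origin of the gaze ray (e.g. the cornea center)
    gaze_dir:    3D direction of the visual axis (need not be unit length)
    plane_point: any point on the screen plane
    plane_normal: normal vector of the screen plane
    Returns the 3D intersection point (the PoR), or None if the ray is
    parallel to the plane or the plane is behind the eye.
    """
    eye_center = np.asarray(eye_center, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)

    denom = gaze_dir @ plane_normal
    if abs(denom) < 1e-9:        # gaze direction parallel to the screen
        return None
    t = ((plane_point - eye_center) @ plane_normal) / denom
    if t < 0:                    # screen lies behind the eye
        return None
    return eye_center + t * gaze_dir

# Eye at the origin looking roughly down -z toward a screen at z = -50 cm
por = gaze_point_on_screen([0, 0, 0], [0.1, 0.05, -1], [0, 0, -50], [0, 0, 1])
# → array([ 5. ,  2.5, -50. ])
```

Everything here is expressed in one world coordinate system; in a real setup the calibration determines where the screen plane sits in that system.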
Although there are many different methods for gaze estimation, low-precision gaze tracking can be achieved with a simple interpolation. The interactive figures below show the basics of interpolation-based methods, which map the pupil center to a point on a 2-dimensional plane in front of the eye. Let's assume that the gaze point always lies on a plane (e.g. a computer display) in front of the eye; we call this plane the fixation plane. Figure 1 shows a schematic illustration of the main elements of a remote gaze tracker setup. The camera is shown as a triangle indicating the camera image and the projection center. A simplified model of the eye is shown with its optical and visual axes. The visual axis intersects the fixation plane at the gaze point (PoR). You can also see how light emitted from a light source is reflected from the surface of the cornea and projected onto the camera image. You can interact with this figure and move the eye and the camera by dragging the red circles. Play around with the figure and see how the projections of the pupil center and the light source in the image change when you change the gaze direction.
Figure 1: Main elements of a remote gaze tracker setup.
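An interpolation-based tracker of this kind is usually calibrated by having the user fixate a handful of known screen points while the pupil center is recorded, and then fitting a mapping between the two sets of coordinates. The sketch below is one common choice, not the only one: a second-order polynomial per screen axis, fitted by least squares (the function names and the synthetic calibration data are my own illustration):

```python
import numpy as np

def poly_features(p):
    """Second-order polynomial basis of a 2D pupil-center position."""
    x, y = p
    return np.array([1.0, x, y, x * y, x * x, y * y])

def fit_mapping(pupil_pts, screen_pts):
    """Least-squares fit from pupil-center image coordinates to screen
    coordinates, one polynomial per screen axis.  Needs at least six
    calibration points (a 3x3 grid is a common choice)."""
    A = np.array([poly_features(p) for p in pupil_pts])
    B = np.asarray(screen_pts, dtype=float)
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)
    return coeffs                     # shape (6, 2)

def estimate_gaze(coeffs, pupil):
    """Map a new pupil-center position to an estimated screen point."""
    return poly_features(pupil) @ coeffs

# Synthetic calibration: a 3x3 grid of pupil positions and the screen
# points the user fixated at each of them (here a made-up linear map).
pupil = [(i, j) for i in range(3) for j in range(3)]
screen = [(100 * x + 50, 80 * y + 30) for x, y in pupil]
coeffs = fit_mapping(pupil, screen)

gaze = estimate_gaze(coeffs, (1.5, 0.5))   # pupil position between grid nodes
```

Because the mapping is fitted per user and per session, it absorbs individual eye geometry without requiring the calibrated hardware setup the geometric method needs, at the cost of degrading when the head moves away from the calibration pose.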