EyeInfo Research Group

Low-Cost Eye Trackers for Sports Training

This research project is being developed by Camilo Rodegheri Mendes dos Santos and Martina Navarro, under the supervision of Prof. Ronald Ranvaud at the Laboratory of Physiology of Behavior, Department of Physiology and Biophysics, Institute of Biomedical Sciences of the University of São Paulo, São Paulo, Brazil.

Figure 14: Player using a head-mounted eye tracker.

The main goal is to develop training strategies that improve the perceptual-cognitive skills of athletes by incorporating low-cost eye-tracking equipment. More specifically, the project aims to provide any athlete with accessible (i.e., very low investment) and reliable equipment that can robustly evaluate individual gaze patterns. Based on these individual evaluations, each athlete can then be instructed according to appropriate training strategies. Additionally, these low-cost eye trackers offer a method for monitoring the adoption and effectiveness of a specific training strategy.

This project uses the Haytham gaze tracker as a low-cost solution for mobile gaze estimation, as shown in Figures 14 and 15.

Figure 15: The player using the head-mounted eye tracker during a training session.

Visual search of a taekwondo expert before the opponent's attack. The athlete made two very long fixations before being attacked: the first (Figure 15.1) on the head and the second (Figure 15.2) on the shoulder region.
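To illustrate how such gaze patterns might be evaluated with low-cost equipment, the following is a minimal sketch that reads a live gaze stream and measures fixation durations with a simple dispersion threshold (an I-DT-style heuristic). The host, port, plain-text "timestamp,x,y" message format, and thresholds used here are assumptions for illustration, not the Haytham server's documented protocol.

```python
import socket

# Hypothetical connection details and wire format -- adjust to your setup.
HOST, PORT = "127.0.0.1", 50000
DISPERSION_PX = 30      # max spread of samples within one fixation
MIN_DURATION_S = 0.10   # shortest event counted as a fixation

def parse_sample(line):
    """Assumed message format: 'timestamp,x,y' as plain text."""
    t, x, y = line.split(",")
    return float(t), float(x), float(y)

def dispersion(window):
    xs = [x for _, x, _ in window]
    ys = [y for _, _, y in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def fixation_durations(samples):
    """Grow a window while its dispersion stays below the threshold;
    when it is exceeded, emit the duration of the completed fixation."""
    window = []
    for sample in samples:
        window.append(sample)
        if dispersion(window) > DISPERSION_PX:
            duration = window[-2][0] - window[0][0] if len(window) > 1 else 0.0
            if duration >= MIN_DURATION_S:
                yield duration
            window = [sample]

def gaze_stream(sock):
    """Yield parsed samples from a newline-delimited TCP stream."""
    buf = b""
    while True:
        data = sock.recv(4096)
        if not data:
            break
        buf += data
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            yield parse_sample(line.decode())

if __name__ == "__main__":
    with socket.create_connection((HOST, PORT)) as sock:
        for duration in fixation_durations(gaze_stream(sock)):
            print(f"fixation: {duration * 1000:.0f} ms")
```

Dispersion-based detection is a common choice for this kind of analysis because it needs only raw gaze coordinates and two easily tuned thresholds; unusually long fixations, like those in Figure 15, then stand out directly in the output.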


Eye Tracking Experiment at ARKEN Museum of Copenhagen

The ARKEN art museum of Copenhagen hosted an exhibition featuring a remarkable installation designed by the artist Olafur Eliasson. The installation, called "Your blind passenger", is a 90-metre-long tunnel filled with dense fog, with white and yellow lights at the beginning and end of the tunnel (Figure 16). Interestingly, when you walk through the tunnel and pass from the yellow area to the white area, you experience an afterimage effect: instead of the white light, you see a purple background in the field of view (a lateral inhibition phenomenon). The eye tracking experiment was conducted to see how the eyes behave inside the tunnel when there is no stimulus in the field of view. The recorded eye and scene videos, with the gaze point overlaid, show the eye behavior in three different trials.

Figure 16: The "Your blind passenger" installation.
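As a side note on how such recordings can be visualized, below is a minimal sketch that overlays logged gaze points on a scene video using OpenCV. The file names and the "frame,x,y" CSV log format are assumptions for illustration, not the format produced in this experiment.

```python
import csv
import cv2  # OpenCV

# Hypothetical file names and log format: one "frame,x,y" row per scene frame.
SCENE_VIDEO = "scene.avi"
GAZE_LOG = "gaze.csv"
OUTPUT_VIDEO = "scene_with_gaze.avi"

# Load the gaze log into a frame-indexed dictionary.
gaze = {}
with open(GAZE_LOG, newline="") as f:
    for row in csv.DictReader(f):
        gaze[int(row["frame"])] = (int(float(row["x"])), int(float(row["y"])))

cap = cv2.VideoCapture(SCENE_VIDEO)
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter(OUTPUT_VIDEO, cv2.VideoWriter_fourcc(*"XVID"), fps, (w, h))

frame_no = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    point = gaze.get(frame_no)
    if point is not None:
        # Draw the gaze point as a red circle on the scene frame.
        cv2.circle(frame, point, 12, (0, 0, 255), 2)
    out.write(frame)
    frame_no += 1

cap.release()
out.release()
```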



Interacting with Objects in the Environment by Gaze and Hand Gestures

This project was done by Jeremy Hales under the supervision of David Rozado at the ICT Centre of CSIRO, Australia. The project explored using gaze and hand gestures in a multimodal interaction scenario for controlling the environment, as shown in Figure 17. A wireless head-mounted gaze tracker built from low-cost hardware was used both for estimating the point of regard and for detecting hand gestures in the scene image. The Haytham gaze tracker was used for continuous, mobile monitoring of the subject's point of regard and for recognizing the visual markers attached to objects in the surrounding environment. The resulting system permitted a subject to move freely in an environment, select the object they want to interact with using gaze (identification), and transmit a command to it by performing a hand gesture (control).

Figure 17: Multimodal interaction using gaze and hand gestures for controlling an environment.
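The identification/control split described above can be made concrete with a small sketch: gaze dwelling on a recognized marker selects the corresponding object, and a subsequent gesture is routed to it as a command. The class names, the dwell threshold, and the send_command transport below are hypothetical; this is an illustration of the interaction logic, not the project's actual implementation.

```python
import time
from dataclasses import dataclass

DWELL_S = 0.5  # gaze must rest on a marker this long to select it

@dataclass
class Marker:
    name: str
    x0: int  # bounding box of the marker in scene-image coordinates
    y0: int
    x1: int
    y1: int

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

class Interaction:
    def __init__(self, markers):
        self.markers = markers
        self.selected = None    # object currently identified by gaze
        self._candidate = None
        self._since = 0.0

    def on_gaze(self, x, y):
        """Identification: select a marker once gaze dwells on it."""
        now = time.monotonic()
        hit = next((m for m in self.markers if m.contains(x, y)), None)
        if hit is not self._candidate:
            self._candidate, self._since = hit, now
        elif hit is not None and now - self._since >= DWELL_S:
            self.selected = hit

    def on_gesture(self, gesture):
        """Control: route a recognized gesture to the selected object."""
        if self.selected is not None:
            send_command(self.selected.name, gesture)

def send_command(target, gesture):
    # Placeholder transport; a real system would message the device here.
    print(f"{target} <- {gesture}")

# Example: select a lamp by gaze, then switch it on with a gesture.
ui = Interaction([Marker("lamp", 100, 100, 180, 180)])
```

Requiring a dwell time before selection is a standard way to avoid the "Midas touch" problem, where every glance would otherwise be interpreted as a command.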


Mobile Gaze-Based Control of a Vehicle

This was a bachelor project done by Per Mortensen and Jacob Rasmussen at the IT University of Copenhagen. In essence, the project concerns a product that allows a user to control a Roomba vacuum cleaning robot inside a given room using data from a mobile gaze tracker (Figure 18). When the user looks at a point in the room and performs a certain gesture, the Roomba attempts to navigate towards the given destination point and optionally starts cleaning on arrival; the setup is shown in Figure 19.

Figure 18: Mobile gaze-based controlling a Roomba vacuum.


Figure 19: Setup of Roomba vacuum controlled by eye movements.

The software developed in this project communicates with the Haytham gaze tracking server and receives the gaze data and the recognized visual markers in the scene image. It finds a path for the Roomba from its current position to the gazed destination point and sends the corresponding control commands to the Roomba.
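A rough sketch of that control loop is shown below, assuming a hypothetical occupancy grid of the room and a placeholder command transport; the project's actual path-finding method and Roomba protocol are not documented here.

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search on a 2D occupancy grid (0 = free, 1 = blocked).
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

def drive_to(path, send):
    """Turn a cell path into simple 'turn'/'forward' commands.
    `send` is a placeholder for whatever transport talks to the Roomba."""
    heading = (0, 1)  # assume the robot starts facing along +x
    for a, b in zip(path, path[1:]):
        step = (b[0] - a[0], b[1] - a[1])
        if step != heading:
            send(("turn", step))
            heading = step
        send(("forward", 1))

# Example: drive from (0, 0) to the gazed destination cell (2, 3).
grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
path = find_path(grid, (0, 0), (2, 3))
if path:
    drive_to(path, send=print)
```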
