EyeInfo Research Group

This project aims to employ an HMGT to interact with real objects in the physical environment without necessarily requiring a monitor or GUI. A user wearing an HMGT can control different objects in the environment directly just by looking at them, using eye movements and gestures (e.g., head gestures), as shown in Figure 1. Beyond controlling objects, capturing the user's gaze pattern and what they pay attention to during the day (in a fully mobile situation) may provide unique information for future home automation. In general, mobile gaze-based environment control in the context of home automation can help people interact autonomously with a smart environment by gaze. In particular, it can be especially useful for improving the care of the elderly and of people with severe motor disabilities, such as ALS (Amyotrophic Lateral Sclerosis), since they generally retain normal control of their eyes.
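To make the interaction concrete, the sketch below shows one common way such gaze-plus-gesture control can be structured: the user selects a device by dwelling their gaze on it, and a head nod toggles it. This is an illustrative sketch only, not the group's actual implementation; the class name, the dwell threshold, and the device identifiers are all hypothetical.

```python
class GazeController:
    """Hypothetical sketch: select a device by gaze dwell, toggle it on a head nod."""

    def __init__(self, dwell_frames=30):
        # Assumed threshold: ~1 second of fixation at a 30 Hz gaze sampler.
        self.dwell_frames = dwell_frames
        self.current_target = None   # device currently being looked at
        self.dwell_count = 0         # consecutive samples on that device
        self.selected = None         # device armed for a gesture command
        self.device_state = {}       # device id -> on/off

    def on_gaze(self, target):
        """Feed one gaze sample: the id of the device being looked at, or None."""
        if target == self.current_target:
            self.dwell_count += 1
        else:
            self.current_target = target
            self.dwell_count = 1
        if target is not None and self.dwell_count >= self.dwell_frames:
            # Dwell threshold reached: the device is selected for control.
            self.selected = target

    def on_head_nod(self):
        """A head-nod gesture toggles the currently selected device."""
        if self.selected is None:
            return None
        new_state = not self.device_state.get(self.selected, False)
        self.device_state[self.selected] = new_state
        return self.selected, new_state
```

In this sketch, dwelling on a lamp for the threshold duration and then nodding would switch the lamp on; a second nod would switch it off. Real systems must additionally handle gaze noise, unintended fixations (the "Midas touch" problem), and mapping gaze rays to physical objects.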

Figure 1: Example of devices controlled by a head-mounted eye tracker.



Visit our main homepage and find out more about the EyeInfo Research Group.


Explore the diverse research areas in which we are interested in working.


Download some of our latest articles published in symposia and journals.


Meet our research team and learn about our research group structure.


For more information about our research group, please feel free to contact us.