Gaze-based Interaction

Technology photo created by creativeart - www.freepik.com

Gaze-based applications enable hands-free interaction, making them suitable for users with disabilities, children, and the elderly. However, these interactions face several limitations and challenges. Chief among them is distinguishing natural eye movements from deliberate interactions with the computer system, a problem commonly referred to as the 'Midas touch'.

We proposed EyeTAP (Eye tracking point-and-select by Targeted Acoustic Pulse), a hands-free interaction method for point-and-select tasks that addresses the Midas touch problem through audio processing of an acoustic pulse input, tolerating ambient noise of up to 70 dB. These characteristics make it applicable to users whose physical movements are restricted or impossible due to a disability. Furthermore, EyeTAP imposes no specific requirements on user interface design, so it can be integrated into existing systems with minimal modifications.

Web surfing using EyeTAP
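To illustrate the idea behind acoustic-pulse selection, here is a minimal sketch (not the actual EyeTAP implementation): a selection event fires when a short burst of audio energy stands out against the ambient noise floor, so the gaze cursor is only "clicked" on a deliberate pulse. The window size and threshold ratio are hypothetical parameters chosen for the example.

```python
# Illustrative sketch of pulse-triggered selection (assumed parameters,
# not the published EyeTAP algorithm).

def rms(window):
    """Root-mean-square energy of a window of audio samples."""
    return (sum(s * s for s in window) / len(window)) ** 0.5

def detect_pulses(samples, window_size=4, ratio=3.0):
    """Return sample indices where window energy exceeds `ratio` times a
    running noise-floor estimate, i.e., candidate selection events."""
    events = []
    noise_floor = rms(samples[:window_size]) or 1e-9
    for i in range(0, len(samples) - window_size, window_size):
        energy = rms(samples[i:i + window_size])
        if energy > ratio * noise_floor:
            events.append(i)  # pulse detected: trigger "select" at the gaze point
        else:
            # slowly adapt the noise floor to ambient conditions
            noise_floor = 0.9 * noise_floor + 0.1 * energy
    return events

# Quiet background with one loud pulse starting at sample 16.
signal = [0.01] * 16 + [0.8, 0.9, 0.8, 0.7] + [0.01] * 16
print(detect_pulses(signal))  # -> [16]
```

Because the noise floor adapts over time, brief spikes still register while gradual changes in ambient loudness do not, which is one simple way to approach the stated tolerance to background noise.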

Furthermore, eye tracking enables researchers to study users' mental states, supporting tasks such as emotion detection, stress measurement, and eye-fatigue recognition. In a recent paper, we proposed FELiX, a mathematical model for measuring eye fatigue in computer users of eye tracking applications.

Publications

  1. Mohsen Parisay, Charalambos Poullis, Marta Kersten, “EyeTAP: A Novel Technique using Voice Inputs to Address the Midas Touch Problem for Gaze-based Interactions”, 2020. https://arxiv.org/abs/2002.08455
  2. Mohsen Parisay, Charalambos Poullis, Marta Kersten-Oertel, “FELiX: Fixation-based Eye Fatigue Load Index: A Multi-factor Measure for Gaze-based Interactions”, 13th International Conference on Human System Interaction (HSI), Tokyo, Japan, 2020. (Best Paper Award finalist)