I am a Ph.D. student and scientific employee in the AC department at the University of Stuttgart, working with Prof. Dr. Steffen Staab and Dr. Chandan Kumar. My research focuses on human-computer interaction.
Currently, I work on the ZIM-funded project MICME: Multimodal Interaction and Collaboration in Medical Environments. The project aims to combine eye tracking with other modalities to support surgeons during complex operations. My research centers on multimodal interaction, combining eye tracking with touch or non-lexical voice input. I have published papers at ACM CHI and ACM ETRA showcasing a novel eye-typing method. I am also a tutor for the HCIIR and Machine Learning courses at the University of Stuttgart and supervise student theses. Before pursuing my Ph.D., I gained seven years of industry experience working for companies such as Bliksund in Norway and Union Betriebs-GmbH in Bonn. Among other Web-oriented IT projects, I developed a rules repository system for the CDU and implemented the personal homepage of Angela Merkel.
Download my résumé.
MSc in Web Science, 2019
University of Koblenz-Landau
BSc in Software Engineering, 2012
Azad University of Mashhad
This paper introduces Hummer, a hands-free text entry method that combines humming and gaze.
We developed a gaze-immersive YouTube player, called GIUPlayer, with two objectives: first, to enable eye-controlled interaction with video content, supporting people with motor disabilities; second, to enable the prospect of quantifying attention as users view video content, which can be used to estimate natural viewing behaviour. In this paper, we illustrate the functionality and design of GIUPlayer, and the visualization of video viewing patterns. The long-term perspective of this work could lead to the realization of eye control and attention-based recommendations in online video platforms and smart TV applications that record eye tracking data.