GazePrompt -- Enhancing Low Vision People’s Reading Experience with Gaze-Aware Augmentations

PI -- Prof. Yuhang Zhao (Mad Ability Lab, Computer Sciences, UW-Madison), Collaborators -- Ru Wang, Zach Potter, Daniel Killough, Linxiu Zeng, Sanbrita Mondal

About this Project: In response to the reading difficulties encountered by people with low vision, we designed gaze-aware reading aids that offer visual and auditory cues tailored to the user’s gaze patterns. Primary features include highlighting the line the user intends to read and reading difficult words aloud.

My main contributions:

• Created a word-difficulty dataset for people with low vision by identifying hesitations and misreads

• Developed hypotheses about the causes of misreads, drawing on natural language processing (NLP) and psycholinguistic research

• Analyzed gaze data to investigate difficult-word detection algorithms
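To give a flavor of the gaze-based detection explored in the last bullet, below is a minimal dwell-time heuristic sketch. It is an illustrative assumption, not the project's actual algorithm: it flags a word as "difficult" when its total fixation time is well above the average dwell time across words. The function name, input format, and threshold ratio are all hypothetical.

```python
from collections import defaultdict

def flag_difficult_words(fixations, ratio=2.0):
    """Flag words whose total gaze dwell time exceeds `ratio` times
    the mean dwell time across all words.

    fixations: list of (word, fixation_duration_ms) tuples,
    e.g. as produced by an eye tracker after mapping gaze to words.
    (Hypothetical format -- a sketch, not the project's pipeline.)
    """
    # Accumulate total fixation time per word.
    totals = defaultdict(float)
    for word, duration_ms in fixations:
        totals[word] += duration_ms
    if not totals:
        return []
    # Compare each word's dwell time to the mean dwell time.
    mean_dwell = sum(totals.values()) / len(totals)
    return sorted(w for w, t in totals.items() if t >= ratio * mean_dwell)

# Example: "ophthalmologist" accumulates far more fixation time
# than the other words, so the heuristic flags it.
fixations = [
    ("the", 120), ("ophthalmologist", 900), ("examined", 300),
    ("the", 100), ("ophthalmologist", 450), ("patient", 250),
]
print(flag_difficult_words(fixations))  # → ['ophthalmologist']
```

A real detector would need to handle fixation-to-word mapping noise, normalize for word length and frequency, and personalize the threshold per reader, which is where the dataset and hypotheses from the earlier bullets come in.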