PhD Proposal: Embodied Non-Visual Interaction: Tangible, Haptic, and Robotic Mediation of Graphical User Interfaces for Blind Individuals

Talk
Jiasheng Li
Time: 01.22.2026, 10:00 to 11:30

Graphical user interfaces (GUIs) pervade everyday life, from online web interfaces to offline touchscreen kiosks. Their vision-dependent design offers intuitive, convenient interaction for sighted people but poses challenges for blind and low-vision (BLV) users. While accessibility standards and tools (for example, WCAG 2.0 and screen readers) provide access to text-based content and basic controls, they often omit graphical information or compress images and charts into lengthy text descriptions. This compression limits understanding of graphical information and even restricts the ability to create graphical content.
This research investigates how to support BLV users in both accessing and creating graphical information through non-visual modalities across varied GUIs in both 2D and 3D environments. The aim is to enable BLV users to work across different inaccessible GUIs with a common set of methods, turning them into usable, familiar interfaces while lowering the learning burden and cognitive load.
To achieve this goal, I present three projects. First, I investigate a robotic mediator that operates across inaccessible GUIs in public spaces to enable access to and manipulation of on-screen content. Second, I design a tangible interface that maps webpage layouts to a physical grid so BLV users can understand structure and author their own layouts. Third, I study haptic cues on the dorsal hand that convey spatial direction and distance beyond two-dimensional screens, supporting access to three-dimensional virtual environments. Together, these efforts define a mediation layer that translates visually encoded layout, state, and motion into accessible audio and haptic cues, supporting both understanding and creation for BLV users.
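To make the mediation-layer idea concrete, here is a minimal, hypothetical sketch in Python of how one on-screen element's visually encoded layout and state might be translated into an audio and a haptic cue. The class, functions, and cue encodings below are illustrative assumptions for this announcement, not the proposal's actual implementation.

```python
# Hypothetical sketch of a mediation layer: these names and encodings are
# assumptions for illustration only, not the proposal's implementation.
from dataclasses import dataclass


@dataclass
class GuiElement:
    label: str   # accessible name, e.g. "Checkout button"
    x: float     # normalized horizontal position (0 = left, 1 = right)
    y: float     # normalized vertical position (0 = top, 1 = bottom)
    state: str   # e.g. "enabled", "disabled", "selected"


def to_audio_cue(el: GuiElement) -> str:
    """Render visually encoded layout and state as a short spoken description."""
    col = "left" if el.x < 0.33 else "right" if el.x > 0.66 else "center"
    row = "top" if el.y < 0.33 else "bottom" if el.y > 0.66 else "middle"
    return f"{el.label}, {el.state}, {row} {col} of the screen"


def to_haptic_cue(el: GuiElement) -> dict:
    """Map screen position to vibration parameters (illustrative encoding)."""
    return {
        "intensity": round(1.0 - el.y, 2),  # stronger toward the top edge
        "pan": round(el.x * 2 - 1, 2),      # -1 = left actuator, +1 = right
    }


if __name__ == "__main__":
    el = GuiElement(label="Checkout button", x=0.8, y=0.9, state="enabled")
    print(to_audio_cue(el))   # Checkout button, enabled, bottom right of the screen
    print(to_haptic_cue(el))  # {'intensity': 0.1, 'pan': 0.6}
```

The point of the sketch is the translation step itself: the same element description can feed a speech channel and a haptic channel, which is what lets one mediation layer serve robotic, tangible, and wearable front ends alike.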