iSonic: Interactive Data Sonification for Blind Users

Figure 1: Sounds create the effect of a virtual map
Figure 2: Customized 3x3 map partition for exploration with a keyboard
Figure 3: Highly coordinated table view and map view of Maryland counties

Project description:

Interactive data visualization tools help users gain insight about data and find patterns and exceptions, but they are usually inaccessible to users with vision impairments. For geo-referenced data, where users need to combine demographic, economic, or other data in a geographic context for decision-making, we designed iSonic, an interactive sonification tool that lets users explore the data in highly coordinated table and choropleth map views. Sounds of varying timbre and pitch are tied to map regions and other interface widgets to create a virtual auditory data display. The integrated use of musical sounds and speech lets users grasp overall data trends and then explore for more details. We use MIDI sound for high availability, and also use virtual 3-D sound, when available, to enhance the user experience.
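The core idea of tying pitch to data values can be illustrated with a minimal sketch. This is not the iSonic implementation; the function name, value range, and MIDI pitch range (36-84) are illustrative assumptions. Higher data values map to higher pitches, so sweeping across map regions renders the data trend as a melodic contour:

```python
def value_to_pitch(value, vmin, vmax, pitch_lo=36, pitch_hi=84):
    """Linearly map a data value onto a MIDI pitch range (hypothetical helper).

    Higher data values yield higher pitches, so an automatic sweep
    over map regions conveys the data trend as a rising/falling melody.
    """
    if vmax == vmin:
        return (pitch_lo + pitch_hi) // 2  # degenerate range: middle pitch
    frac = (value - vmin) / (vmax - vmin)
    return round(pitch_lo + frac * (pitch_hi - pitch_lo))

# Example: three county values mapped to pitches for a sweep
values = [12000, 45000, 98000]
pitches = [value_to_pitch(v, min(values), max(values)) for v in values]
```

The resulting pitch numbers could then be played through any MIDI synthesizer, with timbre (MIDI instrument) reserved for distinguishing interface widgets or views.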

Users interact with the data using a standard computer keyboard or, when available, a smooth-surface touchpad. Already implemented interactions include: (1) automatically sweeping the map or the table to hear data patterns; (2) recursively partitioning the map into 3-by-3 ranges and using the keyboard number pad to explore each range, or using the arrow keys to move among individual regions; (3) gliding a finger across, or pressing individual spots on, a smooth-surface touchpad to examine individual regions (the touchpad can be remapped to a partial map through zooming); (4) dynamically adjusting the detail level of the auditory feedback. Our goal is to explore the design space by conducting user studies to identify effective sonifications of choropleth maps and geo-referenced data, and to examine how well our tool helps vision-impaired users. We also want to investigate the sonification of maps as a complement to visual maps for sighted users (e.g., to make "visible" the District of Columbia).
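The recursive 3-by-3 partitioning in interaction (2) can be sketched as follows. This is an illustrative assumption of one way to implement it, not the iSonic code; the function name and coordinate convention are hypothetical. The number pad mirrors the map spatially (7-8-9 is the top row, 1-2-3 the bottom row), and pressing a key repeatedly drills into ever-smaller ranges:

```python
def zoom_cell(bounds, key):
    """Return the 3x3 sub-range selected by a numpad key (hypothetical helper).

    bounds = (x0, y0, x1, y1), with y increasing upward.  The numpad
    layout mirrors the map: keys 7-8-9 select the top row of ranges,
    4-5-6 the middle row, and 1-2-3 the bottom row.
    """
    x0, y0, x1, y1 = bounds
    col = (key - 1) % 3           # 0 = left column, 2 = right column
    row = (key - 1) // 3          # 0 = bottom row, 2 = top row
    w, h = (x1 - x0) / 3, (y1 - y0) / 3
    return (x0 + col * w, y0 + row * h,
            x0 + (col + 1) * w, y0 + (row + 1) * h)

# Pressing 9 then 1 drills into the top-right range, then its bottom-left sub-range
b = zoom_cell((0.0, 0.0, 9.0, 9.0), 9)   # (6.0, 6.0, 9.0, 9.0)
b = zoom_cell(b, 1)                       # (6.0, 6.0, 7.0, 7.0)
```

Because each keypress narrows the bounds by a factor of three per axis, a few presses suffice to isolate a single region, after which its data value can be sonified.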

Participants

Haixia Zhao, Computer Science PhD Graduate Student
Catherine Plaisant, Associate Research Scientist, HCIL
Ben Shneiderman, Professor, Computer Science

in collaboration with:
Dmitry Zotkin and Ramani Duraiswami (Perceptual Interfaces and Reality lab)
Ben Smith and Kent Norman (Dept. Psychology).
Franco Delogu and Marta Olivetti Belardinelli (University of Rome, Italy)
Jonathan Lazar (Towson University)

Publications

Videos

iSonic v0.5 video (~5 min 45 sec): best quality (QuickTime movie, 124MB), good quality (QuickTime movie, 74MB), OK quality (Flash 6.0 movie, 24.6MB)

Demos

You can run the demo directly from the Web using Java Web Start, or download the package and run it locally.

Source Code

Slides

Support

Related sites

Census project: more info on the YMAP Interactive Maps
See the other Visualization Projects at HCIL.