Metrics for the Evaluation of Visual Analytics
A VIS 2007 Workshop

A full-day workshop, Sunday, October 28th, at the VIS 2007 conference


Agenda and Position Papers

Before the workshop:
- please read the position papers
- prepare your 10-minute briefing, focusing on the metrics you would advocate using for the evaluation
- if your position paper has multiple authors, please let us know before the workshop how many will be attending. We encourage as many of you to attend as possible.

Timing: 8:30 am to 5:45 pm; see also the VIS 2007 online program.

Location: please check the VIS final program when you register.

Proposed agenda
8:30 - 8:45 am        Introduction
8:45 - 10:00          Seven 10-minute presentations
10:00 - 10:30         Break
10:30 - 10:50         Two 10-minute presentations
10:50 - 12:00 pm      Discussion of metrics to try in the afternoon
12:00 - 2:00 pm       Group lunch
2:00 - 3:30           Divide into two groups: one group applies metrics to Jigsaw, the other to Oculus
3:30 - 4:00           Break
4:00 - 5:00           Switch groups (the session times are uneven, but the second round should go faster)
5:00 - 5:45           Discussion; future directions

Logistics: We will have laptops with the two Contest submissions, Jigsaw and Oculus, available at the workshop for use in the two groups. 

Position papers:

Introducing additional domain-specific measures in evaluating visual analytic tools

Mark A. Whiting, Carrie Varley, Jereme Haack
Pacific Northwest National Laboratory

Toward Measuring Visualization Insight
Chris North, Virginia Tech

Issues and Methodologies for Evaluating the Jigsaw Visual Analytic System
Carsten Görg, Sarah Williams, John Stasko
Georgia Institute of Technology

Imago: An integrated prototyping, evaluation and transitioning environment for information visualization
Rudi Vernik (1), G. Stewart Von Itzstein (1), Alain Bouchard (2)
(1) DSTO, Australia; (2) DRDC Valcartier, Quebec, Canada

Process and Productivity in Visual Analytics: Reflections on E-Discovery
Sean M. McNee and Ben Arnette
Attenex Corporation

A Framework for User-Centered Evaluation for a Visual Analytics Contest
Sharon Laskowski, Theresa O’Connell, Yee-Yin Choong
National Institute of Standards and Technology

Longitudinal Evaluation Methods in Human-Computer Studies and Visual Analytics
Jens Gerken, Peter Bak, and Harald Reiterer
University of Konstanz

Working documents:

Extracted metrics (v1)
Jean Scholtz
