Workshop on the Evaluation of Software Defect Detection Tools

Sunday, June 12th, 2005
Co-located with PLDI 2005

Workshop co-chairs: Bill Pugh (University of Maryland) and Jim Larus (Microsoft Research)

Program chair: Dawson Engler (Stanford University)

Program Committee: Andy Chou, Manuvir Das, Michael Ernst, Cormac Flanagan, Dan Grossman, Jonathan Pincus, Andreas Zeller

Overview

This workshop is intended to provide a forum for the community of researchers and developers of software defect detection tools to discuss issues related to the development and evaluation of these tools. People working on these tools share many common problems and experiences, which typically do not find their way into academic publications or presentations. This community knowledge is invaluable to other tool developers and can help identify new and promising research directions. This workshop provides an opportunity to discuss these shared issues.

Important Dates

May 11th: Early registration deadline for workshop and deadline for PLDI rates at hotel
June 6th: Camera-ready PDF versions of all materials to be distributed at the workshop must be emailed to William Pugh
June 12th: Workshop

Workshop Attendance and Registration

Workshop registration is now nearly full; if you want to register, you need to contact Bill Pugh. Priority will be given to authors of accepted papers; everyone else will be placed on a waitlist. Waitlisted registrants will be accepted on May 10th. Register for the workshop (and hotel) through the PLDI conference page.

Format

Speakers giving research presentations should prepare no more than 15 minutes of material. We have allotted an additional 15 minutes for discussion during and after each presentation.

For discussion sessions, several people have been asked to start the discussion with brief presentations/statements. These presentations are to be no longer than 5 minutes. If you need slides, please provide PDF files to be loaded onto the presentation machine beforehand (no laptop switching during presentations), and limit yourself to 1 or 2 slides of data relevant to your statement (no bullet points). The remainder of the discussion time is open; anyone is welcome to join in.

Proceedings

Hard and electronic copies of all research presentations and position papers will be distributed at the meeting, in two forms:

  • one PDF document containing all presentations and position papers
  • one zip archive containing all individual presentations and position papers

The conference web page will make available all of the position papers, as well as notes from the conference and a collected bibliography on software defect detection tools.

Although the conference web page will list the research presentations given at the workshop, it will not host those papers. Authors who wish to make their presentations available should host them on their own web sites. This is to ensure that the workshop is correctly understood to be informal, and that presenting research here is not considered a barrier to republishing that research at conferences.

Schedule: Sunday, June 12th

This schedule is preliminary; the order of papers/statements within sections may change, and author affiliations are incomplete.

8:30 am

Discussion on Soundness

  • The Soundness of Bugs is What Matters, Patrice Godefroid, Bell Laboratories, Lucent Technologies
  • Soundness and its Role in Bug Detection Systems, Yichen Xie, Mayur Naik, Brian Hackett, Alex Aiken, Stanford University
9:15 am

break

9:30 am

Research presentations

  • Locating Matching Method Calls by Mining Revision History Data, Benjamin Livshits, Thomas Zimmermann, Stanford University
  • Evaluating a Lightweight Defect Localization Tool, Valentin Dallmeier, Christian Lindig, Andreas Zeller, Saarland University
10:30 am

break

10:45 am

Defect Detection at Microsoft - Where the Rubber Meets the Road, Manuvir Das, Center for Software Excellence, Microsoft

11:15 am

Discussion of Deployment and Adoption

  • The Open Source Proving Grounds, Ben Liblit, University of Wisconsin-Madison
  • Issues in deploying SW defect detection tools, David Cok, Eastman Kodak R&D
  • False Positives Over Time: A Problem in Deploying Static Analysis Tools, Andy Chou, Coverity
12 noon

lunch

1:00 pm

Research presentations

  • Model Checking x86 Executables with CodeSurfer/x86 and WPDS++, Gogul Balakrishnan, Thomas Reps, Nick Kidd, Akash Lal, Junghee Lim, David Melski, Radu Gruian, Suan Yong, Chi-Hua Chen, Tim Teitelbaum, Univ. of Wisconsin
  • Empowering Software Debugging Through Architectural Support for Program Rollback, Radu Teodorescu, Josep Torrellas, UIUC Computer Science
  • EXPLODE: A Lightweight, General Approach to Finding Serious Errors in Storage Systems, Junfeng Yang, Paul Twohey, Ben Pfaff, Can Sar, Dawson Engler, Stanford University
2:30 pm

break

2:45 pm

Research presentations

  • Experience from Developing the Dialyzer: A Static Analysis Tool Detecting Defects in Erlang Applications, Kostis Sagonas, Uppsala University
  • Soundness by Static Analysis and False-alarm Removal by Statistical Analysis: Our Airac Experience, Yungbum Jung, Jaehwang Kim, Jaeho Sin, Kwangkeun Yi, Seoul National University
3:45 pm

break

4:00 pm

Discussion of Benchmarking

  • Dynamic Buffer Overflow Detection, Michael Zhivich, Tim Leek, Richard Lippmann, MIT Lincoln Laboratory
  • Using a Diagnostic Corpus of C Programs to Evaluate Buffer Overflow Detection by Static Analysis Tools, Kendra Kratkiewicz, Richard Lippmann, MIT Lincoln Laboratory
  • BugBench: A Benchmark for Evaluating Bug Detection Tools, Shan Lu, Zhenmin Li, Feng Qin, Lin Tan, Pin Zhou, Yuanyuan Zhou, UIUC
  • Benchmarking Bug Detection Tools, Roger Thornton, Fortify Software
  • A Call for a Public Bug and Tool Registry, Jeffrey Foster, Univ. of Maryland
  • Bug Specimens are Important, Jaime Spacco, David Hovemeyer, William Pugh, University of Maryland
  • NIST Software Assurance Metrics and Tool Evaluation (SAMATE) Project, Michael Kass, NIST
5:00 pm

Discussion of New Ideas

  • Deploying Architectural Support for Software Defect Detection in Future Processors, Yuanyuan Zhou, Josep Torrellas, UIUC
  • Using Historical Information to Improve Bug Finding Techniques, Chadd Williams, Jeffrey Hollingsworth, Univ. of Maryland
  • Locating defects is uncertain, Andreas Zeller, Saarland University
  • Is a Bug Avoidable and How Could It Be Found?, Dan Grossman, Univ. of Washington
5:45 pm

wrap-up and discussion of future workshops

6:00 pm

done