CMSC838M: Advanced Topics in Software Testing

Atif M. Memon
Fall 2003

Course Description

This course examines advanced software testing techniques. In particular, it reviews the important phases of testing, emphasizing the significance of each phase when testing different types of software. Students will learn the state of the art in testing technology for object-oriented, component-based, concurrent, distributed, graphical-user-interface, and web software. Closely related concepts such as model checking and program analysis will also be studied, and emerging concepts such as test-case prioritization and their impact on testing will be examined. Students will gain hands-on testing and analysis experience by proposing new solutions to open research problems in the field of software testing and experimentally demonstrating the strengths and weaknesses of their solutions.

By the end of this course, students should be familiar with the state-of-the-art in software testing. Students should also be aware of the major open problems in testing.

Is the course valid for PhD qualifying coursework?

Yes (Software Engineering/Programming Languages)

Is the course valid for MS qualifying coursework?

Yes (Software Engineering/Programming Languages)

Is the course valid for MS comps?

Yes. (Both Midterm and Final exams count towards the MS comps.)

Meeting Times

Tue./Thu., 2:00 PM - 3:15 PM (CSI 3118)

Office Hours

Tue./Thu., 3:15 PM - 5:00 PM (4115 A. V. Williams Building)

Prerequisites

CMSC 435 or equivalent.

Schedule and required readings

(Subject to change; additional readings may be assigned)
 

Copies of all readings are available in the instructor's office.

Lectures 1, 2 (Sep. 2, 4): Course Overview (Slides) (Slides)

Readings:

· Testing: a roadmap. Mary Jean Harrold. Proceedings of the Conference on the Future of Software Engineering, May 2000.

· Introduction to special section on software testing. R. Hamlet. Communications of the ACM, Volume 31, Issue 6, June 1988.

· Testing: principles and practice. Stephen R. Schach. ACM Computing Surveys (CSUR), Volume 28, Issue 1, March 1996.

· Software safety: why, what, and how. Nancy G. Leveson. ACM Computing Surveys (CSUR), Volume 18, Issue 2, June 1986.

· Validation, Verification, and Testing of Computer Software. W. Richards Adrion, Martha A. Branstad, John C. Cherniavsky. ACM Computing Surveys (CSUR), Volume 14, Issue 2, June 1982.

Lectures 3, 4, 5, 6, 7 (Sep. 9, 11, 16, 18, 23): Taxonomies of Techniques, Test Case Generation (Slides) (Slides) (Slides) (Slides) (Slides) (Slides)

Readings:

· Rethinking the taxonomy of fault detection techniques. Michael Young, Richard N. Taylor. Proceedings of the 11th International Conference on Software Engineering, May 1989.

· A test generation strategy for pairwise testing. Kuo-Chung Tai, Yu Lei. IEEE Transactions on Software Engineering, Volume 28, Issue 1, Jan. 2002, pages 109-111.

· The category-partition method for specifying and generating functional tests. T. J. Ostrand, M. J. Balcer. Communications of the ACM, Volume 31, Issue 6, June 1988.

· Hierarchical GUI test case generation using automated planning. Atif M. Memon, Martha E. Pollack, Mary Lou Soffa. IEEE Transactions on Software Engineering, Volume 27, Issue 2, Feb. 2001, pages 144-155.

· Using a goal-driven approach to generate test cases for GUIs. Atif M. Memon, Martha E. Pollack, Mary Lou Soffa. Proceedings of the 21st International Conference on Software Engineering, May 1999.

· Predicate-based test generation for computer programs. Kuo-Chung Tai. Proceedings of the 15th International Conference on Software Engineering, 1993, pages 267-276.
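The pairwise-testing idea behind the Tai and Lei reading can be illustrated with a small sketch: instead of running every combination of parameter values, generate tests until every pair of values appears together at least once. The greedy loop below is an illustration only, not the algorithm from the paper, and all names and data are invented.

```python
from itertools import combinations, product

def pairwise_tests(params):
    """Greedily pick test cases until every pair of parameter values
    is covered (hypothetical helper, not the paper's algorithm)."""
    names = list(params)
    idx_pairs = list(combinations(range(len(names)), 2))

    def pairs_of(combo):
        return {(i, combo[i], j, combo[j]) for (i, j) in idx_pairs}

    candidates = list(product(*(params[n] for n in names)))
    uncovered = set().union(*(pairs_of(c) for c in candidates))
    tests = []
    while uncovered:
        # pick the candidate covering the most still-uncovered pairs
        best = max(candidates, key=lambda c: len(pairs_of(c) & uncovered))
        tests.append(dict(zip(names, best)))
        uncovered -= pairs_of(best)
    return tests

# 3 parameters with 2 values each: 8 exhaustive combinations,
# but all 12 value pairs fit into fewer tests
tests = pairwise_tests({"os": ["linux", "win"],
                        "browser": ["ff", "ie"],
                        "db": ["mysql", "pg"]})
```

For this toy configuration the greedy loop covers all pairs with 4 tests instead of the 8 exhaustive combinations, which is where the savings of pairwise testing come from.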

Lecture 8 (Sep. 25): Test Adequacy and Coverage (Slides) (Slides) (Slides)

Readings:

· Software unit test coverage and adequacy. Hong Zhu, Patrick A. V. Hall, John H. R. May. ACM Computing Surveys (CSUR), Volume 29, Issue 4, December 1997.

· The evaluation of program-based software test data adequacy criteria. E. J. Weyuker. Communications of the ACM, Volume 31, Issue 6, June 1988.

· Coverage criteria for GUI testing. Atif M. Memon, Mary Lou Soffa, Martha E. Pollack. Proceedings of the 8th European Software Engineering Conference held jointly with the 9th ACM SIGSOFT Symposium on Foundations of Software Engineering (ACM SIGSOFT Software Engineering Notes, Volume 26, Issue 5), September 2001.
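An adequacy criterion such as statement coverage can be made concrete in a few lines. The sketch below (our own illustration, not a tool used in the course) uses Python's `sys.settrace` to record which lines of a function a test actually executes, showing why one test of a branching function is inadequate under statement coverage.

```python
import sys

def executed_lines(func, *args):
    """Run `func` under a line tracer and return the set of its own
    source lines that executed (illustrative helper, names are ours)."""
    code = func.__code__
    lines = set()

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is code:
            lines.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return lines

def classify(x):
    if x > 0:
        return "positive"
    else:
        return "non-positive"

one_test = executed_lines(classify, 5)
both_tests = one_test | executed_lines(classify, -5)
# the second test executes the return statement the first one missed
```

A single test leaves one of the two return statements uncovered; adding the second input makes the suite statement-adequate for `classify`.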

Lecture 9 (Sep. 30): Data-flow Analysis, Data-flow Testing (Slides)

Readings:

· Interprocedural data flow testing. M. Harrold, M. Soffa. Proceedings of the ACM SIGSOFT '89 Third Symposium on Software Testing, Analysis, and Verification (ACM SIGSOFT Software Engineering Notes, Volume 14, Issue 8), November 1989.

· An applicable family of data flow testing criteria. P. G. Frankl, E. J. Weyuker. IEEE Transactions on Software Engineering, Volume 14, Issue 10, Oct. 1988, pages 1483-1498.
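The def-use pairs at the heart of data-flow testing criteria can be sketched for straight-line code: record where each variable was last defined, and emit a pair whenever a later statement uses it. The statement encoding below is our own invention for illustration; real criteria, as in the Frankl and Weyuker paper, work over control-flow graphs with paths between defs and uses.

```python
def def_use_pairs(stmts):
    """Compute def-use pairs for straight-line code. Each statement is
    (defined_var_or_None, set_of_used_vars); statements are numbered
    from 1 (input format is ours, for illustration)."""
    last_def = {}   # variable -> statement number of its latest definition
    pairs = []
    for n, (target, uses) in enumerate(stmts, start=1):
        for v in sorted(uses):
            if v in last_def:
                pairs.append((v, last_def[v], n))  # (variable, def site, use site)
        if target is not None:
            last_def[target] = n
    return pairs

# models: x = read(); y = x + 1; x = y * 2; print(x)
pairs = def_use_pairs([("x", set()),
                       ("y", {"x"}),
                       ("x", {"y"}),
                       (None, {"x"})])
```

A data-flow-adequate test suite would then have to exercise each of these (def site, use site) pairs.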

Lectures 10, 11 (Oct. 2, 7): Test Oracles (Slides) (Slides)

Readings:

· Automated test oracles for GUIs. Atif M. Memon, Martha E. Pollack, Mary Lou Soffa. Proceedings of the Eighth International Symposium on Foundations of Software Engineering for Twenty-First Century Applications (ACM SIGSOFT Software Engineering Notes, Volume 25, Issue 6), November 2000.

· Specification-based test oracles for reactive systems. Debra J. Richardson, Stephanie Leif Aha, T. Owen O'Malley. Proceedings of the 14th International Conference on Software Engineering, June 1992.

· Design and Empirical Comparison of Multiple Test Oracles for GUIs. Ishan Banerjee. MS Thesis.
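The core of a state-comparison oracle of the kind the GUI oracle readings study can be sketched simply: after an event executes, compare the expected properties of each widget against the actual observed state and report every mismatch as a potential fault. The widget/property representation below is a toy we chose for illustration, not the papers' actual model.

```python
def gui_oracle(expected_state, actual_state):
    """Compare an expected GUI state against the state observed after an
    event, returning property mismatches. States are widget -> property
    dicts (a toy representation, for illustration only)."""
    mismatches = []
    for widget, props in expected_state.items():
        for prop, want in props.items():
            got = actual_state.get(widget, {}).get(prop)
            if got != want:
                mismatches.append((widget, prop, want, got))
    return mismatches

# hypothetical expected vs. observed state after clicking "Save"
expected = {"ok_button": {"enabled": True}, "title": {"text": "Save"}}
actual = {"ok_button": {"enabled": False}, "title": {"text": "Save"}}
faults = gui_oracle(expected, actual)
```

The design questions the readings explore follow directly from this loop: how much state to compare, and how often, trades fault-detection ability against cost.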

Lecture 12 (Oct. 9): Test Coverage II (Slides); Cyntrica Eaton's presentation on web testing

Readings:

· Experiments on the effectiveness of dataflow- and control-flow-based test adequacy criteria. M. Hutchins, H. Foster, T. Goradia, T. Ostrand. Proceedings of the 16th International Conference on Software Engineering (ICSE-16), 1994, pages 191-200.

Lectures 13, 14 (Oct. 14, 16): Regression Testing (Slides) (Slides) and Software Engineering and Testing (Slides)

Readings:

· An empirical study of regression test selection techniques. Todd L. Graves, Mary Jean Harrold, Jung-Min Kim, Adam Porter, Gregg Rothermel. ACM Transactions on Software Engineering and Methodology (TOSEM), Volume 10, Issue 2, April 2001.

· Regression testing of GUIs. Atif M. Memon, Mary Lou Soffa. Proceedings of the 9th European Software Engineering Conference held jointly with the 10th ACM SIGSOFT International Symposium on Foundations of Software Engineering, September 2003.
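The regression test selection techniques compared in the Graves et al. study share one underlying idea that a short sketch can convey: given per-test coverage data, rerun only the tests that exercise code the change touched. This is a simplified illustration with invented data, not any specific technique from the paper.

```python
def select_tests(coverage, changed_lines):
    """Rerun only the tests whose covered lines intersect the modified
    lines (a simplified sketch of coverage-based selection; the names
    and data here are invented for illustration)."""
    return [name for name, lines in coverage.items() if lines & changed_lines]

# hypothetical per-test line coverage from the previous version
coverage = {"test_login": {10, 11, 12},
            "test_logout": {30, 31},
            "test_search": {12, 50}}

# line 12 was modified, so only the tests touching it are selected
selected = select_tests(coverage, changed_lines={12})
```

Safe selection hinges on the coverage data being accurate for the modified program, which is exactly the assumption the empirical study probes.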

Lecture 15 (Oct. 21): Midterm Exam

Lecture 16 (Oct. 23): Student Project Proposals

Lecture 17 (Oct. 28): Guest Speaker. Adithya Nagarajan's presentation on "Multiple test oracles and their effect on fault-detection effectiveness"

Lecture 18 (Oct. 30): Student Topic Presentation: Comparing Testing Techniques (Student: Srinivas Kashyap; get copies of papers from the student)

Lecture 19 (Nov. 4): Student Topic Presentation: Web Testing (Student: Brian James Krznarich; get copies of papers from the student)

Lecture 20 (Nov. 6): Student Topic Presentation: Test-case Prioritization (Student: Jeffrey Blank; get copies of papers from the student)

Lecture 21 (Nov. 11): Student Topic Presentation: Testing Object-oriented Software (Student: Qing Xie; get copies of papers from the student)

Lecture 22 (Nov. 13): Guest Speaker. April Ahn, "State of the Art in Automating Usability Evaluation of User Interfaces"

Lecture 23 (Nov. 18): Student Topic Presentation: Specifications for Software Testing (Student: Nikhil Swamy; get copies of papers from the student)

Lecture 24 (Nov. 20): Projects: Problems and Solutions Session

Lecture 25 (Nov. 25): Paper Review

Lecture 26 (Dec. 2): Students' Project Presentations

Lecture 27 (Dec. 4): Students' Project Presentations

Lecture 28 (Dec. 9): Paper Review Discussion

Lecture 29 (Dec. 11): Final Exam

Student Projects

Students should choose a project with the consent of the instructor. The student(s) may develop a new testing technique for one of the following types of software:

· Object-oriented

· Component-based

· Concurrent

· Distributed

· Graphical-user interface

· Web

· OR suggest your own project.

A one-page project proposal (due Oct. 23) summarizing your project and your new technique will be presented in class (10 minutes).

The student must experimentally demonstrate that the new technique can be successfully used to test the software. A project report must be submitted at the end of the semester. The report should contain the following sections:

1. Abstract

2. Introduction (motivation, importance of testing the particular type of software, one paragraph on why existing techniques are insufficient, your technique and its key contributions)

3. Detailed description of your idea (several sections, if necessary)

4. Algorithms (pseudo-code) and Experiments (graphs/tables)

5. Related Work (weaknesses of each related approach)

6. Conclusions (weaknesses of your technique and possible future directions)

Remember, suggestions for new projects are welcome.

Student Presentations

 

Students must present a set of related papers (perhaps from the list of suggested papers) on one of the following topics:

o Combining and Comparing Techniques

o Defect and Failure Estimation and Analysis

o Model Checking

o Test Case Prioritization

o Testing Concurrent Programs

o Testing Object-oriented Software

o Testing Spreadsheets

o Web Testing

 

Each presentation will be 75 minutes long, including 10 minutes for questions/discussion. The presentation will be divided into the following parts:

· Problem definition/motivation

· What are the challenges?

· Background literature surveyed by the authors

· Specific technique developed in these papers

· Weaknesses of the techniques, future work

 

Students will prepare a PowerPoint presentation, which will be made available on the course web page at least two days before the presentation. The final exam will be based on the contents of these presentations.

Assessment

· 25% Midterm Exam

· 25% Final Exam

· 20% Topic Presentation (40 minutes)

· 5% Project Presentation (10 minutes)

· 25% Term Project (chosen by the student and approved by the instructor; may be a team project of two students, depending on its scope)

It is expected that all students understand University policies on academic honesty. Cheating on assignments or exams is very serious and will not be tolerated in this class. It is permissible to talk to other students about assignments and to discuss particular solutions; however, you are not to share code with anyone except your partner.

Important Dates 

o Labor Day Holiday: 9/1/03 (Monday)

o Classes Start: 9/2/03 (Tuesday)

o Thanksgiving Holiday: 11/27/03 (Thursday) through 11/30/03 (Sunday)

o Last Class: 12/12/03 (Friday)

o Study Day: 12/13/03 or 12/14/03 (Saturday or Sunday)

o Final Exams Start: 12/15/03 (Monday)

o Final Exams End: 12/20/03 (Saturday)