CMSC838M: Advanced Topics in Software Testing

Atif M. Memon
Fall 2001

Course Description

This course will examine advanced software testing techniques. In particular, the important phases of testing will be reviewed, emphasizing the significance of each phase when testing different types of software. Students will learn the state of the art in testing technology for object-oriented, component-based, concurrent, distributed, graphical-user interface, and web software. In addition, closely related concepts such as model checking and program analysis will also be studied. Emerging concepts such as test-case prioritization and their impact on testing will be examined. Students will gain hands-on testing/analysis experience by proposing new solutions to open research problems in the field of software testing and experimentally demonstrating the strengths/weaknesses of their solutions.

By the end of this course, students should be familiar with the state-of-the-art in software testing. Students should also be aware of the major open problems in testing.

Is the course valid for PhD qualifying coursework?

Yes (Software Engineering/Programming Languages)

Is the course valid for MS qualifying coursework?

Yes (Software Engineering/Programming Languages)

Is the course valid for MS comps?

Yes. (Both Midterm and Final exams count towards the MS comps.)

Meeting Times

Tue-Thu -- 2:00PM - 3:15PM (4424 A.V. Williams Building)

Prerequisites

CMSC 435 or equivalent.

Schedule and required readings

(Subject to change; additional readings may be assigned)
 

Week

Dates

Topic

Readings (copies available in the instructor’s office)

1

Aug. 29-31

Course Overview

 

2

Sep. 3-7

Introduction to Testing (preliminary concepts – testability, whitebox, blackbox, unit, integration, mutation)

  • W.R. Adrion; M.A. Branstad; J.C. Cherniavsky, Validation, Verification, and Testing of Computer Software, ACM Computing Surveys 14(2): 159-192, June 1982.
  • Stephen R. Schach, Testing: principles and practice, ACM Computing Surveys; Volume 28, Issue 1 (1996).
  • M. Young; R.N. Taylor. Rethinking the Taxonomy of Fault Detection Techniques, in Proceedings of the 11th International Conference on Software Engineering, pp. 53-62, Pittsburgh, PA, May 1989.
  • N.G. Leveson, Software Safety: What, Why, and How, ACM Computing Surveys; 18(2): 125-163, June 1986.

3

Sep. 10-14

ESEC/FSE-9 (No classes)

 

4

Sep. 17-21

Test-case Generation

  • T.J. Ostrand; M.J. Balcer, The Category-Partition Method for Specifying and Generating Functional Tests, Communications of the ACM; 31(6):676-686, June 1988.
  • Adele Howe, Anneliese von Mayrhauser and Richard Mraz, Test Case Generation as an AI Planning Problem, Automated Software Engineering, Vol.4, No.1, pp. 77-106, January, 1997.
  • Neelam Gupta, Aditya P. Mathur, and Mary Lou Soffa, Automated Test Data Generation Using an Iterative Relaxation Method, ACM SIGSOFT Sixth International Symposium on Foundations of Software Engineering (FSE-6), Orlando, Florida, USA, November 1998.

5

Sep. 24-28

Test Coverage

  • E.J. Weyuker. The evaluation of program-based software test data adequacy criteria, Commun. ACM 31, 6 (Jun. 1988), Pages 668 - 675.
  • Frankl, P.G.; Weyuker, E.J., Provable improvements on branch testing, IEEE Transactions on Software Engineering, Volume: 19 Issue: 10, Oct. 1993, Page(s): 962 – 975.
  • Hong Zhu, Patrick A. V. Hall and John H. R. May; Software unit test coverage and adequacy; ACM Comput. Surv. 29, 4 (Dec. 1997), Pages 366 – 427.

6

Oct. 1-5

Test Oracles

  • D.J. Richardson, TAOS: Testing with Analysis and Oracle Support, in Proceedings of the 1994 International Symposium on Software Testing and Analysis, pp. 138-153, Seattle, WA, August 1994.
  • Atif M. Memon, Martha E. Pollack and Mary Lou Soffa, Automated test oracles for GUIs, Proceedings of the eighth international symposium on Foundations of software engineering, 2000, Pages 30 – 39.
  • Debra J. Richardson, Stephanie Leif Aha and T. Owen O'Malley, Specification-based test oracles for reactive systems, Proceedings of the 14th International conference on Software engineering, 1992, Pages 105 - 118

7

Oct. 8-12

Regression Testing

  • D.S. Rosenblum and E.J. Weyuker. Using Coverage Information to Predict the Cost-Effectiveness of Regression Testing Strategies, IEEE Transactions on Software Engineering; 23(3): 146-156, March 1997.
  • Todd L. Graves, Mary Jean Harrold, Jung-Min Kim, Adam Porter and Gregg Rothermel, An empirical study of regression test selection techniques, ACM Trans. Softw. Eng. Methodol. 10, 2 (Apr. 2001), Pages 184 – 208.
  • Gregg Rothermel and Mary Jean Harrold; A framework for evaluating regression test selection techniques; Proceedings Of The 16th International Conference On Software Engineering, 1994, Pages 201 – 210
  • Gregg Rothermel and Mary Jean Harrold; A safe, efficient regression test selection technique; ACM Trans. Softw. Eng. Methodol. 6, 2 (Apr. 1997), Pages 173 – 210

8

Oct. 15-19

Data-flow Analysis, Data-flow Testing

  • M. Harrold and M. Soffa, Interprocedural data flow testing, Proceedings of the ACM SIGSOFT '89 Third Symposium on Software Testing, Analysis, and Verification, 1989, Pages 158 – 167.
  • Frankl, P.G.; Weyuker, E.J., An applicable family of data flow testing criteria, IEEE Transactions on Software Engineering, Volume: 14 Issue: 10, Oct. 1988, Page(s): 1483 – 1498.
  • Weyuker, E.J., The cost of data flow testing: an empirical study, IEEE Transactions on Software Engineering, Volume: 16 Issue: 2, Feb. 1990, Page(s): 121 – 128.
  • Frankl, P.G.; Weyuker, E.J., An analytical comparison of the fault-detecting ability of data flow testing techniques, Proceedings of the 15th International Conference on Software Engineering, 1993, Page(s): 415 – 424.
  • Weyuker, E.J., More experience with data flow testing, IEEE Transactions on Software Engineering, Volume: 19 Issue: 9, Sept. 1993, Page(s): 912 – 919.
  • Parrish, A.S.; Zweben, S.H., On the relationships among the all-uses, all-DU-paths, and all-edges testing criteria, IEEE Transactions on Software Engineering, Volume: 21 Issue: 12, Dec. 1995, Page(s): 1006 – 1009.

9

Oct. 22-26

Mid-term Exam, Project Proposals

 

10

Oct. 29-Nov. 2

Student Presentations: Object-oriented Software, Component-based Software

Oct. 30, 2:05PM:  Huo Yan Chen, T. H. Tse, F. T. Chan and T. Y. Chen; In black and white: an integrated approach to class-level testing of object-oriented programs; ACM Trans. Softw. Eng. Methodol. 7, 3 (Jul. 1998), Pages 250 – 295. [PRESENTER: Renars Gailis renars@cs.umd.edu]

Oct. 30, 2:40PM:  Roong-Ko Doong and Phyllis G. Frankl; The ASTOOT approach to testing object-oriented programs; ACM Trans. Softw. Eng. Methodol. 3, 2 (Apr. 1994), Pages 101 – 130. [PRESENTER: Lingling Zhang lingz@cs.umd.edu]

Nov. 01, 2:00PM: Weyuker, E.J., Testing component-based software: a cautionary tale, IEEE Software, Volume: 15 Issue: 5, Sept.-Oct. 1998, Page(s): 54–59. [PRESENTER: Y.C. Justin Wan ycwan@cs.umd.edu]

Nov. 01, 2:30PM: Y. Labiche, P. Thévenod-Fosse, H. Waeselynck and M.H. Durand, Testing levels for object-oriented software, Proceedings of the 22nd International Conference on Software engineering, 2000, Pages 136-145. [PRESENTER: Sasan Dashtinezhad sasan@cs.umd.edu].

Nov. 01, 3:00PM: Paul C. Jorgensen and Carl Erickson, Object-oriented integration testing, Comm. ACM 37, 9 (Sep. 1994), Pages 30 – 38. [PRESENTER: Yang Hedong hedong@wam.umd.edu].

11

Nov. 5-9

Guest Speaker: Tuesday, November 6th

Arkady Pogostkin, AOL Technologies

2:00pm, 3258 A.V. Williams Bldg.

&

Student Presentations: Concurrent Software, Distributed Software

Nov. 08, 2:05PM: Gwan-Hwan Hwang; Kuo-Chung Tai; Ting-Lu Huang, Reachability testing: an approach to testing concurrent software, Proceedings of the First Asia-Pacific Software Engineering Conference, 1994, Page(s): 246 – 255. [PRESENTER: Vijay Gopalakrishnan gvijay@cs.umd.edu]

Nov. 08, 2:40PM: Gregor V. Bochmann and Alexandre Petrenko, Protocol testing: review of methods and relevance for software testing, Proceedings of the 1994 international symposium on Software testing and analysis, 1994, Pages 109–124. [PRESENTER: Arunesh Mishra arunesh@cs.umd.edu].

12

Nov. 12-16

Student Presentations: Graphical-user Interfaces, Web Applications

Nov. 13, 2:05PM: Atif M. Memon, Mary Lou Soffa and Martha E. Pollack, Coverage Criteria for GUI Testing, 8th European Software Engineering Conference (ESEC) and 9th ACM SIGSOFT International Symposium on the Foundations of Software Engineering (FSE-9), Vienna University of Technology, Austria, Sept. 10-14, 2001. [PRESENTER: Cassie Thomas cassie@cs.umd.edu]

Nov. 13, 2:40PM: Ricca, F.; Tonella, P., Analysis and testing of web applications, Proceedings of the 23rd International Conference on Software Engineering (ICSE 2001), 2001, Page(s): 25 – 34. [PRESENTER: Shang C. Wu meou@cs.umd.edu]

Nov. 15, 2:05PM: Memon, A.M.; Pollack, M.E.; Soffa, M.L., Hierarchical GUI test case generation using automated planning, IEEE Transactions on Software Engineering, Volume: 27 Issue: 2, February 2001, Page(s): 144 – 155. [PRESENTER: Narendar Shankar narendar@cs.umd.edu]

Nov. 15, 2:40PM: Kallepalli, C.; Tian, J., Usage measurement for statistical web testing and reliability analysis, Proceedings of the Seventh International Software Metrics Symposium (METRICS 2001), 2001, Page(s): 148 – 158. [PRESENTER: Srinivasan Parthasarathy sri@cs.umd.edu].

13

Nov. 19-23

Student Presentations: Database Applications, Model Checking

Nov. 20, 2:05PM: David Chays, Saikat Dan, and Phyllis G. Frankl (Polytechnic University), Filippos Vokolos (Lucent Technologies), Elaine J. Weyuker (AT&T Labs Research), A Framework for Testing Database Applications, International Symposium on Software Testing and Analysis, 22-25 August 2000, pages 147-157. [PRESENTER: Edward Hung ehung@cs.umd.edu]

Nov. 20, 2:40PM: Daniel Jackson, Abstract Model Checking of Infinite Specifications, Proc. Formal Methods Europe, Barcelona, October 1994. [PRESENTER: Lida ltang@cs.umd.edu]

14

Nov. 26-30

Student Presentations: Concurrent Software, Distributed Software

&

Student Project Presentations

Nov. 27, 2:00PM: Kuo-Chung Tai; Karacali, B., On Godefroid's state-less search technique for testing concurrent programs, Proceedings of the 5th International Symposium on Autonomous Decentralized Systems 2001, Page(s): 77 –84. [PRESENTER: Ishan Banerjee ishan@cs.umd.edu]

Nov. 27, 2:30PM: Harrold, M.J.; Malloy, B.A., Data flow testing of parallelized code, Proceedings of the Conference on Software Maintenance, 1992, Page(s): 272 – 281. [PRESENTER: Vinay Shet vinay@cs.umd.edu]

Nov. 27, 3:00PM: J. Fagerström, Design And Test Of Distributed Applications, 10th International Conference on Software Engineering, 1988, Pages 88–92. [PRESENTER: Tamer Elsharnouby sharno@cs.umd.edu].

15

Dec. 3-7

Student Project Presentations

 

16

Dec. 10-14

Student Project Presentations

&

Final Exam

  • Material discussed after the mid-term.

Lecture Slides (pdf)

·      Week1

·      Week2

·      Week4

·      Week5

·      Week6

·      Week7

·      Week8

 

Student Presentations (ppt)

·      Week10 (Renars, Lingling, Justin, Sasan, Hedong)

·      Week11 (Vijay, Arunesh)

·      Week12 (Cassie, Shang, Narendar, Srinivasan)

·      Week13 (Edward, Lida)

·      Week14 (Ishan, Vinay, Tamer)

Student Projects

Students should choose a project with the consent of the instructor. The student(s) must develop a new testing technique for one of the following types of software:

·        Object-oriented

·        Component-based

·        Concurrent

·       Distributed

·       Graphical-user Interface

·       Web

·       OR suggest your own project.

A one-page project proposal (due during the week of Oct. 22-26) summarizing your project and your new technique will be presented (5 minutes) in class.

The student must experimentally demonstrate that the new technique can be successfully used to test the software. A project report must be submitted at the end of the semester. The report should contain the following sections:

1.     Abstract

2.     Introduction (motivation, importance of testing the particular type of software, one paragraph on why existing techniques are insufficient, your technique and its key contributions)

3.     Detailed description of your idea (several sections, if necessary)

4.     Algorithms (pseudo-code) and Experiments (graphs/tables)

5.     Related Work (weaknesses of each related approach)

6.     Conclusions (weaknesses of your technique and possible future directions)

Remember, suggestions for new projects are welcome.

Student Presentations

 

Students must present a paper (perhaps from the list of suggested papers) on one of the following topics:

·       Prioritizing Test Cases

·       Model Checking

·       Testing Object-oriented Software

·       Testing Component-based Software

·       Testing Concurrent Software

·       Testing Distributed Software

·       Testing Graphical-user Interfaces

·       Testing Web Applications

 

Each presentation will be 30 minutes long, including 5 minutes for questions/discussion. The presentation will be divided into the following parts:

·       Problem definition/motivation

·       What are the challenges?

·       Background literature surveyed by the authors

·       Specific technique developed in this paper

·       Weaknesses of the technique

 

Since two students will present different papers on the same topic on the same day, they may work together to avoid repetition. Students will need to develop a PowerPoint presentation, which will be made available on the course web page at least two days before the presentation. The final exam will be based on the contents of these presentations.

Assessment

·       25% Mid-term Exam.

·       25% Final Exam.

·       20% Topic Presentation (40 minutes).

·       5% Project Presentation (10 minutes).

·       25% Term Project (chosen by the student and approved by the instructor; may be a team project of 2 students, depending on its scope).

It is expected that all students understand University policies on academic honesty. Cheating on assignments or exams is very serious and will not be tolerated in this class. It is permissible to talk to other students about assignments and to discuss particular solutions. However, you are not to share code with anyone except your partner.

Important Dates – Fall 2001

·       **Classes Start 8/29/01 (Wednesday)

·       **Labor Day Holiday 9/3/01 (Monday)

·       **Thanksgiving Holiday 11/22/01 (Thursday) through 11/25/01 (Sunday)

·       Last Class 12/11/01 (Tuesday)

·       Study Day 12/12/01 (Wednesday)

·       Final Exams Start 12/13/01 (Thursday)

·       Final Exams End 12/19/01 (Wednesday)

·       Winter Commencement 12/20/01 (Thursday)