CMSC838M: Advanced Topics in Software Testing
This course will examine advanced software testing techniques. In particular, the important phases of testing will be reviewed, emphasizing the significance of each phase when testing different types of software. Students will learn the state of the art in testing technology for object-oriented, component-based, concurrent, distributed, graphical-user-interface, and web software. In addition, closely related concepts such as model checking and program analysis will also be studied. Emerging concepts such as test-case prioritization and their impact on testing will be examined. Students will gain hands-on testing/analysis experience by proposing new solutions to open research problems in the field of software testing and experimentally demonstrating the strengths and weaknesses of their solutions.
By the end of this course, students should be familiar with the state-of-the-art in software testing. Students should also be aware of the major open problems in testing.
Is the course valid for PhD qualifying coursework? Yes (Software Engineering/Programming Languages).
Is the course valid for MS qualifying coursework? Yes (Software Engineering/Programming Languages).
Is the course valid for MS comps? Yes (both the Mid-term and Final exams count toward the MS comps).
Tue-Thu, 2:00PM - 3:15PM, 4424 A.V. Williams Building
Prerequisite: CMSC 435 or equivalent.
(Subject to change; additional readings may be assigned)
Schedule by week (copies of the readings are available in the instructor's office):
Week 1 (Aug. 29-31): Course Overview
Week 2 (Sep. 3-7): Introduction to Testing (preliminary concepts: testability, white-box, black-box, unit, integration, mutation)
Week 3 (Sep. 10-14): ESEC/FSE-9 (no classes)
Week 4 (Sep. 17-21): Test-case Generation
Week 5 (Sep. 24-28): Test Coverage
Week 6 (Oct. 1-5): Test Oracles
Week 7 (Oct. 8-12): Regression Testing
Week 8 (Oct. 15-19): Data-flow Analysis, Data-flow Testing
Readings:
· M. Harrold and M. Soffa, "Interprocedural Data Flow Testing," Proceedings of the ACM SIGSOFT '89 Third Symposium on Software Testing, Analysis, and Verification, 1989, pp. 158-167.
· P. G. Frankl and E. J. Weyuker, "An Applicable Family of Data Flow Testing Criteria," IEEE Transactions on Software Engineering, vol. 14, no. 10, Oct. 1988, pp. 1483-1498.
· E. J. Weyuker, "The Cost of Data Flow Testing: An Empirical Study," IEEE Transactions on Software Engineering, vol. 16, no. 2, Feb. 1990, pp. 121-128.
· P. G. Frankl and E. J. Weyuker, "An Analytical Comparison of the Fault-Detecting Ability of Data Flow Testing Techniques," Proceedings of the 15th International Conference on Software Engineering, 1993, pp. 415-424.
· E. J. Weyuker, "More Experience with Data Flow Testing," IEEE Transactions on Software Engineering, vol. 19, no. 9, Sept. 1993, pp. 912-919.
· A. S. Parrish and S. H. Zweben, "On the Relationships Among the All-Uses, All-DU-Paths, and All-Edges Testing Criteria," IEEE Transactions on Software Engineering, vol. 21, no. 12, Dec. 1995, pp. 1006-1009.
Week 9 (Oct. 22-26): Mid-term Exam, Project Proposals
Week 10 (Oct. 29-Nov. 2): Student Presentations: Object-oriented Software, Component-based Software
· Oct. 30, 2:05PM: Huo Yan Chen, T. H. Tse, F. T. Chan, and T. Y. Chen, "In Black and White: An Integrated Approach to Class-Level Testing of Object-Oriented Programs," ACM Transactions on Software Engineering and Methodology, vol. 7, no. 3, Jul. 1998, pp. 250-295. [PRESENTER: Renars Gailis renars@cs.umd.edu]
· Oct. 30, 2:40PM: Roong-Ko Doong and Phyllis G. Frankl, "The ASTOOT Approach to Testing Object-Oriented Programs," ACM Transactions on Software Engineering and Methodology, vol. 3, no. 2, Apr. 1994, pp. 101-130. [PRESENTER: Lingling Zhang lingz@cs.umd.edu]
· Nov. 01, 2:00PM: E. J. Weyuker, "Testing Component-Based Software: A Cautionary Tale," IEEE Software, vol. 15, no. 5, Sept.-Oct. 1998, pp. 54-59. [PRESENTER: Y.C. Justin Wan ycwan@cs.umd.edu]
· Nov. 01, 2:30PM: Y. Labiche, P. Thévenod-Fosse, H. Waeselynck, and M. H. Durand, "Testing Levels for Object-Oriented Software," Proceedings of the 22nd International Conference on Software Engineering, 2000, pp. 136-145. [PRESENTER: Sasan Dashtinezhad sasan@cs.umd.edu]
· Nov. 01, 3:00PM: Paul C. Jorgensen and Carl Erickson, "Object-Oriented Integration Testing," Communications of the ACM, vol. 37, no. 9, Sep. 1994, pp. 30-38. [PRESENTER: Yang Hedong hedong@wam.umd.edu]
Week 11 (Nov. 5-9): Guest Speaker & Student Presentations: Concurrent Software, Distributed Software
· Tuesday, Nov. 6, 2:00PM: Guest speaker Arkady Pogostkin, AOL Technologies, 3258 A.V. Williams Bldg.
· Nov. 08, 2:05PM: Gwan-Hwan Hwang, Kuo-Chung Tai, and Ting-Lu Huang, "Reachability Testing: An Approach to Testing Concurrent Software," Proceedings of the First Asia-Pacific Software Engineering Conference, 1994, pp. 246-255. [PRESENTER: Vijay Gopalakrishnan gvijay@cs.umd.edu]
· Nov. 08, 2:40PM: Gregor V. Bochmann and Alexandre Petrenko, "Protocol Testing: Review of Methods and Relevance for Software Testing," Proceedings of the 1994 International Symposium on Software Testing and Analysis, 1994, pp. 109-124. [PRESENTER: Arunesh Mishra arunesh@cs.umd.edu]
Week 12 (Nov. 12-16): Student Presentations: Graphical-user Interfaces, Web Applications
· Nov. 13, 2:05PM: Atif M. Memon, Mary Lou Soffa, and Martha E. Pollack, "Coverage Criteria for GUI Testing," 8th European Software Engineering Conference (ESEC) and 9th ACM SIGSOFT International Symposium on the Foundations of Software Engineering (FSE-9), Vienna University of Technology, Austria, Sept. 10-14, 2001. [PRESENTER: Cassie Thomas cassie@cs.umd.edu]
· Nov. 13, 2:40PM: F. Ricca and P. Tonella, "Analysis and Testing of Web Applications," Proceedings of the 23rd International Conference on Software Engineering (ICSE 2001), 2001, pp. 25-34. [PRESENTER: Shang C. Wu meou@cs.umd.edu]
· Nov. 15, 2:05PM: A. M. Memon, M. E. Pollack, and M. L. Soffa, "Hierarchical GUI Test Case Generation Using Automated Planning," IEEE Transactions on Software Engineering, vol. 27, no. 2, Feb. 2001, pp. 144-155. [PRESENTER: Narendar Shankar narendar@cs.umd.edu]
· Nov. 15, 2:40PM: C. Kallepalli and J. Tian, "Usage Measurement for Statistical Web Testing and Reliability Analysis," Proceedings of the Seventh International Software Metrics Symposium (METRICS 2001), 2001, pp. 148-158. [PRESENTER: Srinivasan Parthasarathy sri@cs.umd.edu]
Week 13 (Nov. 19-23): Student Presentations: Database Applications, Model Checking
· Nov. 20, 2:05PM: David Chays, Saikat Dan, and Phyllis G. Frankl (Polytechnic University), Filippos Vokolos (Lucent Technologies), and Elaine J. Weyuker (AT&T Labs Research), "A Framework for Testing Database Applications," International Symposium on Software Testing and Analysis, Aug. 22-25, 2000, pp. 147-157. [PRESENTER: Edward Hung ehung@cs.umd.edu]
· Nov. 20, 2:40PM: Daniel Jackson, "Abstract Model Checking of Infinite Specifications," Proceedings of Formal Methods Europe, Barcelona, Oct. 1994. [PRESENTER: Lida ltang@cs.umd.edu]
Week 14 (Nov. 26-30): Student Presentations: Concurrent Software, Distributed Software & Student Project Presentations
· Nov. 27, 2:00PM: Kuo-Chung Tai and B. Karacali, "On Godefroid's State-less Search Technique for Testing Concurrent Programs," Proceedings of the 5th International Symposium on Autonomous Decentralized Systems, 2001, pp. 77-84. [PRESENTER: Ishan Banerjee ishan@cs.umd.edu]
· Nov. 27, 2:30PM: M. J. Harrold and B. A. Malloy, "Data Flow Testing of Parallelized Code," Proceedings of the Conference on Software Maintenance, 1992, pp. 272-281. [PRESENTER: Vinay Shet vinay@cs.umd.edu]
· Nov. 27, 3:00PM: J. Fagerström, "Design and Test of Distributed Applications," Proceedings of the 10th International Conference on Software Engineering, 1988, pp. 88-92. [PRESENTER: Tamer Elsharnouby sharno@cs.umd.edu]
Week 15 (Dec. 3-7): Student Project Presentations
Week 16 (Dec. 10-14): Student Project Presentations & Final Exam
Students should choose a project with the consent of the instructor. The student(s) must develop a new testing technique for one of the following types of software:
· Object-oriented
· Component-based
· Concurrent
· Distributed
· Graphical-user Interface
· Web
OR suggest your own project.
A one-page project proposal (due during the week of Oct. 22-26), summarizing your project and your new technique, will be presented (5 minutes) in class.
The student must experimentally demonstrate that the new technique can be successfully used to test the software. A project report must be submitted at the end of the semester. The report should contain the following sections:
1. Abstract
2. Introduction (motivation, importance of testing the particular type of software, one paragraph on why existing techniques are insufficient, your technique and its key contributions)
3. Detailed description of your idea (several sections, if necessary)
4. Algorithms (pseudo-code) and Experiments (graphs/tables)
5. Related Work (weaknesses of each related approach)
6. Conclusions (weaknesses of your technique and possible future directions)
Remember, suggestions for new projects are welcome.
Students must present a paper (perhaps from the list of suggested papers) on one of the following topics:
· Prioritizing Test Cases
· Model Checking
· Testing Object-oriented Software
· Testing Component-based Software
· Testing Concurrent Software
· Testing Distributed Software
· Testing Graphical-user Interfaces
· Testing Web Applications
Each presentation will be 30 minutes long, including 5 minutes for questions/discussion. The presentation will be divided into the following parts:
· Problem definition/motivation
· What are the challenges?
· Background literature surveyed by the authors
· Specific technique developed in this paper
· Weaknesses of the technique
Since two students will present two different papers from each topic on the same day, they may work together to avoid repetition. Students will need to develop a PowerPoint presentation, which will be made available on the course web page at least two days before the presentation. The final exam will be based on the contents of these presentations.
· 25% Mid-term Exam.
· 25% Final Exam.
· 20% Topic Presentation (40 minutes).
· 5% Project Presentation (10 minutes).
· 25% Term Project (chosen by the student and approved by the instructor; may be a team (2 students) project, depending on the scope of the project).
It is expected that all students understand University policies on academic honesty. Cheating on assignments or exams is very serious and will not be tolerated in this class. It is permissible to talk to other students about assignments and to discuss particular solutions. However, you are not to share code with anyone except your partner.
· Classes Start: 8/29/01 (Wednesday)
· Labor Day Holiday: 9/3/01 (Monday)
· Thanksgiving Holiday: 11/22/01 (Thursday) through 11/25/01 (Sunday)
· Last Class: 12/11/01 (Tuesday)
· Study Day: 12/12/01 (Wednesday)
· Final Exams Start: 12/13/01 (Thursday)
· Final Exams End: 12/19/01 (Wednesday)
· Winter Commencement: 12/20/01 (Thursday)