Software Engineering

Spring 2010; CMSC 435; Section 0101


Project

Welcome to TerpSoft!

By registering for this class, you have automatically become part of a pseudo-company called TerpSoft. This company develops software according to the Software Engineering principles taught in this course. Your customer is the course instructor assisted by the class TA. Your job is to complete the software engineering tasks required by the customer. Because you are part of TerpSoft, YOU need to take initiative to get the project started, make progress, and complete it on time. Schedule meetings with the course instructor and TA so that you get a clear description of the requirements for each project phase.

Before you start, PLEASE READ

- IMPORTANT: As part of this project, you will be expected to use buggy third-party software. This is deliberate, to simulate real-life scenarios. Students are expected to work around the bugs and submit their projects on time.
- Late deliverables will be accepted with a 20% penalty per day. Start your projects early; last-minute computer malfunctions will not be accepted as a reason for delaying an assignment's due date.
- All teams will present their project in class.
- The university computer labs should provide all necessary support for the project. Other resources normally available to you (e.g., home computers) may be used, but doing so is "at your own risk." No alterations to the conditions of the assignment will be made to accommodate peculiarities of your other computing resources.

Overview of customer's requirements

In Fall 2009, I asked a software company to develop a system called GUITAR (http://guitar.sourceforge.net/), which was required to have six applications (JFCGUIRipper, GUIStructure2EFG, TestCaseGenerator, JFCGUIReplayer, WindowsGUIRipper, WindowsGUIReplayer). Because the company did not use good software engineering practices, it was unable to deliver a good product. Only the first four applications (JFCGUIRipper, GUIStructure2EFG, TestCaseGenerator, JFCGUIReplayer) were completed, and those without adequate documentation. The remaining two applications (WindowsGUIRipper, WindowsGUIReplayer) remained at the prototype stage, written in C#. I am now hiring TerpSoft to fix some of these problems.

Project Phases

Phase 1 (to be done individually); due date: Feb. 5, 2010.

Goal: Downloading and running all the applications on a simple input.
Procedure: In this phase, you need to demonstrate that you are able to run all six applications (please see your course instructor for the inputs to these applications). The source code for all these applications resides in a Subversion repository that can be viewed at http://guitar.svn.sourceforge.net/viewvc/guitar/. Please refer to https://sourceforge.net/projects/guitar/develop for additional help with Subversion. Each application (let's call it foo) resides in the folder foo.tool/ in the repository. Check out this folder for each application and read its README.txt file. There you will find more information about the modules that foo uses for building and execution, as well as instructions on how to build and run foo.
Deliverables: There are no deliverables for this phase. Points will be awarded for demos. (But maintain a list of problems that you encountered with the tools during this phase. You will need it in a later phase.)
Additional reading(s): http://www.cs.umd.edu/~atif/papers/MemonSTVR2007-abstract.html

Grading session: During the grading session, you will have access to a new machine (Windows or Linux) that is connected to the Internet. You will have to download, build, and execute all six applications and run them on all the inputs. You will also need to install Ant, Subversion, Java, C++, and any other tools needed to run the applications. Each application is worth 15 points.

Points: 90

We will now start using fixed teams; each student will be assigned to a particular team. Phases 2 and 3 will be graded per team. This simulates an industry scenario in which you are hired and assigned to a specific team.

Teams: C++, Ripper, EFG, TCG, Replayer

Team members:

Abdelsalam, Mahmoud Mohame
Abdullah, Mustafa
Apau, Emmanuel Twum
Bhatt, Arya
Braun, Michael Richard
Cartas, Anika Silvina
Chitlur Lakshman, Deepak
Craciunescu, Cosmin Octavi
Dattilio, Patrick Christop
de Castro, Fernando Corra
Egan, Thomas Michael
Goldin, Paul
Greene, Jared Elliot
Isaac, Joseph
Jamison, Kasey Lynn
Kaminski, Christopher Thom
Kay, Jonathan Lee
Khan, Kashif Maqsood
Kraft, Gregory Kenneth
Lam, Pui Ying
Locke, John Bradford
Lockyear, Alexander Garber
Miluski, Matthew Scott
Musavi, Omeed
Muthara, Meria
Naegele, Joseph Daniel
Rau, Jonathan Frank
Schoenfeld, Andrew N
Sethi, Nidhi
Smith, Bria Danielle
Solomon, Joseph Michael
Sternberg, Jonathan Alan
Verner, Arseni
Woo, Kevin J
Yohannes, Mekias Mebrahtu
Young, Jason

Phase 2 (to be done in a team); due date: Feb. 26, 2010. (NOTE: this phase does not apply to the Windows Ripper/Replayer team. See Phase 2 requirements below.)

Goal: Understanding the requirements of your project and trying out more complex inputs.
Procedure: Schedule an interview (or multiple interviews) with the customer to understand the requirements. Also talk to the developer to get an idea of the system architecture, coding conventions, data formats, and other implementation details.
Deliverables: The requirements document, developer's manual, and user manual. (Also maintain a list of problems/bugs that you encountered during this phase. You will need it in a later phase.)
Additional reading(s): The wiki documents at http://guitar.sourceforge.net/. The Subversion repository at http://guitar.svn.sourceforge.net/viewvc/guitar/.

Grading session: The documents need to be uploaded to the http://guitar.sourceforge.net/ web-site. They will be examined for correctness and completeness.

Points: 210

- Total Points (210)
  - Main Page of Tool (20)
    - Overall "Look and Feel" (5)
    - Clear description of the tool or link to the description (10)
    - Links to user guide, developer guide, source code, and executable (5)
  - Requirements Document (25)
    - Overall "Look and Feel" (5)
    - Clear description of the tool or link to the description (5)
    - Scope of document, intended audience (5)
    - User requirements (5)
    - Developer requirements (5)
  - User Guide (65)
    - Overall "Look and Feel" (5)
    - Clear description of the tool or link to the description (10)
    - Downloading and installing (5)
    - How to run the tool: step-by-step guide with an example application (35)
    - Troubleshooting (10)
  - Developer Guide (100)
    - Overall "Look and Feel" (5)
    - Clear description of the tool or link to the description (2)
    - Coding conventions (2)
    - GUITAR repository conventions (2)
    - Getting the code for the tool (2)
    - How to build the tool (2)
    - How to make changes and submit the code (5)
    - How to test your changes (10)
    - Developing a new plug-in for the tool (70)
      - What is a plug-in for this tool (5)
      - Description of an existing plug-in (15)
        - What are its parts (5)
        - Where is the source code (5)
        - How does it interact with the rest of the code; show a figure? (5)
      - How to write a new plug-in (50) (a minimal sketch appears after this list)
        - Explanation of the new plug-in: "what will it do?" (5)
        - Step-by-step guide with full downloadable code of the new plug-in (45)
          - Steps (25)
          - One-click downloadable new plug-in with instructions on how to add it to GUITAR (20)
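A developer guide usually becomes much clearer once the plug-in concept is shown as code. The skeleton below is only a sketch under assumed names: the ToolPlugin interface, its two hooks, and the LoggingPlugin class are hypothetical and are not the actual GUITAR plug-in API, which you should take from your tool's source code and from your discussion with its developer.

    // Hypothetical skeleton of a plug-in for one of the GUITAR tools.
    // The interface name, method names, and the way a plug-in is
    // registered are assumptions for illustration only; the real
    // extension points are defined in each tool's source and README.txt.

    /** Assumed contract: the tool calls these hooks during a run. */
    interface ToolPlugin {
        /** Called once before the tool starts processing its input. */
        void beforeRun(String[] args);

        /** Called once after the tool has written its output files. */
        void afterRun();
    }

    /** A minimal plug-in that only logs when it is invoked. */
    public class LoggingPlugin implements ToolPlugin {

        @Override
        public void beforeRun(String[] args) {
            System.out.println("LoggingPlugin: tool starting with "
                    + args.length + " argument(s)");
        }

        @Override
        public void afterRun() {
            System.out.println("LoggingPlugin: tool finished");
        }
    }

The corresponding developer guide section would then explain where such a class lives in the foo.tool/ folder, how the tool is configured to load it, and how it is built and tested.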

Phase 2 (to be done in a team); due date: Feb. 26, 2010. (ONLY FOR THE Windows Ripper/Replayer team.)

Goal: Understanding the requirements of your project and understanding the GUIRipper-Core and GUIReplayer-Core API.
Procedure: Schedule an interview (or multiple interviews) with the customer to understand the requirements. Also talk to the developer of the GUIRipper-Core and GUIReplayer-Core to get an idea of the system architecture, coding conventions, data formats, and other implementation details.
Deliverables: The requirements document and implementation plan.
Additional reading(s): The wiki documents at http://guitar.sourceforge.net/. The Subversion repository at http://guitar.svn.sourceforge.net/viewvc/guitar/.

Grading session: The requirements document needs to be uploaded to the http://guitar.sourceforge.net/ web-site. It will be examined for correctness and completeness. The implementation plan needs to be handed to the instructor. This will be discussed during a meeting. Note that the implementation plan will include a schedule for coding each method in the GUIRipper-Core and GUIReplayer-Core API.

Points: 210

Phase 3 (to be done in a team); due date: Mar. 12, 2010. (NOTE: this phase does not apply to the Windows Ripper/Replayer team. See "Phase 3 and above" requirements below.)

Goal: Refactoring the existing test cases.
Procedure: All the modules used for your tool have a tests/ folder that contains unit tests. Additionally, the .tool folders contain system tests that perform system testing of the tool. Many of these tests "don't work" because of incorrect paths, library files, etc. Study these issues and update the tests (see the example sketched below). Ensure that the tests run correctly on the Hudson continuous integration server.
Deliverables: Existing test cases that "work". (Also maintain a list of bugs that you encountered during this phase. You will need it in a later phase.)
Additional reading(s):
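Many of the failing tests break for mundane reasons, such as an absolute path that existed only on the original developer's machine. The JUnit 4 sketch below shows one common repair, loading a test input from the classpath instead of a hard-coded location; the class and resource names are hypothetical placeholders, not code taken from the repository.

    // Hypothetical repair of a path-dependent unit test (JUnit 4).
    // The resource name /inputs/sample.xml is a placeholder; the point
    // is to resolve test inputs from the test classpath rather than an
    // absolute path such as C:/work/guitar/inputs/sample.xml.
    import static org.junit.Assert.assertNotNull;

    import java.io.InputStream;
    import org.junit.Test;

    public class InputLoadingTest {

        @Test
        public void testInputFileIsOnTheClasspath() {
            // Before: new FileInputStream("C:/work/guitar/inputs/sample.xml")
            // After:  load the same file from the module's test resources.
            InputStream in = getClass().getResourceAsStream("/inputs/sample.xml");
            assertNotNull("sample.xml should be packaged with the tests", in);
        }
    }

Fixes like this keep the test independent of any particular machine, which is exactly what a Hudson job needs.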

Grading session: The tests will be executed on the Hudson server and graded.

Points: 150

- Total points (150)
  - Fully automatic successful execution of large system tests (5)
  - Fully automatic successful execution of small system tests (5)
  - Fully automatic successful execution of all integration tests (5)
  - Fully automatic successful execution of all unit tests (5)
  - Coverage report of large system tests (5)
  - Coverage report of small system tests (5)
  - Coverage report of all integration tests (5)
  - Coverage report of all unit tests (5)
  - JavaDoc reports of all integration tests (5)
  - JavaDoc reports of all unit tests (5)
  - Successful end-to-end execution and coverage report generation by Hudson of large system tests (25)
  - Successful end-to-end execution and coverage report generation by Hudson of small system tests (25)
  - Successful end-to-end execution, JavaDoc production, and coverage report generation by Hudson of all integration tests (25)
  - Successful end-to-end execution, JavaDoc production, and coverage report generation by Hudson of all unit tests (25)

Phases 3 and ABOVE (to be done in a team); due dates: According to the schedule in the implementation plan. (ONLY FOR THE Windows Ripper/Replayer team.)

Goal: Realizing the implementation plan. Coding may be done in C++ or C#.
Procedure: Implement the code according to the implementation plan submitted for Phase 2.
Deliverables: Source code, documentation in the source, unit tests, scripts to compile and build the project, and Hudson scripts.
Additional reading(s):

Grading session: The source will be compiled and executed on the Windows inputs.

Points: To be determined using the implementation plan; maximum 700 points.

We will now start using the flexible team (each is called a flex-team) style of project management. By now, from your experience in phases 2 and 3, you know the strengths and weaknesses of your teammates. Create your own flex-team, consisting of at least one member (i.e., you) and at most 6 members, for phases 4, 5, and 6. Flex-teams will be subsets of your original teams. If you are happy with your entire team from phases 2 and 3, feel free to continue to work together as a flex-team. The advantage is that flex-teams can change from phase to phase based on the performance of individuals. You don't need to inform me about the members of your flex-team until submission time. This simulates an industry scenario in which you quickly put together a team of "consultants" based on their capabilities to perform a specific task.

(This change does not apply to the Windows Ripper/Replayer team.)

 

Phase 4 (graded individually); due date: Mar. 25, 2010. (NOTE: this phase does not apply to the Windows Ripper/Replayer team.)

Goal: Reporting bugs in the existing tools.
Procedure: Use the list of problems and bugs from phases 1-3 and document them using the bug reporting tool.
Deliverables: Bug reports on sourceforge.net. Provide details of how each bug was discovered and how it may be replicated.
Additional reading(s):

Grading session: The bug reports will be checked for completeness. Put your name on the bug report that you submit. If several of you worked together as a flex-team to find and report a bug, please put names of all involved. If needed, you can get new sourceforge IDs and use them to report the bugs. No action is needed from my end. This phase is being graded individually so that you have some flexibility with forming your flex-team. For completeness, your name should appear on at least five bug reports. Each report is worth 15 points, for a total of 15x5=75 points. If duplicate reports (i.e., reporting the same bug) are found, I will keep only the first one (they are all time-stamped) and delete the rest.

Points: 75

- Bug reports on SourceForge.net: at least 5 reports (15 points per report)

 

Phase 5 (graded per flex-team); due date: Apr. 16, 2010. (NOTE: this phase does not apply to the Windows Ripper/Replayer team.)

Goal: Fix all the bugs and develop additional test cases.
Procedure: Ensure that the Hudson jobs associated with your tool and modules are running. Work with your instructor to make copies of the Hudson jobs for your flex-team. Examine the Hudson reports for small, large, unit, and integration tests. Check why they are failing and fix all causes of failure. Add new unit and integration tests to get 100% statement and branch coverage for each type of test (see the example sketched below). JavaDoc all tests. Ensure that none of the Hudson jobs fail for any reason. You will work within your own folders: trunk/tests-<your-flex-team-ID> for unit tests and integration-tests-<your-flex-team-ID> for integration tests.
Deliverables: The Hudson jobs for small, large, unit, and integration tests should succeed. The unit and integration tests should individually show 100% statement and branch coverage. Coverage reports and JavaDocs for all code and test cases.
Additional reading(s):
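Raising statement and branch coverage usually means adding small, JavaDoc'd tests that drive each branch of a method. The example below is a sketch under an assumption: Titles.normalize(String) is a stand-in utility invented here, not a GUITAR class; substitute whatever method your Hudson coverage report flags as uncovered.

    // Hypothetical JavaDoc'd unit tests that cover both branches of a
    // small utility method. Titles is a stand-in class defined here so
    // the example is self-contained; in Phase 5 you would target real
    // uncovered code in your own modules instead.
    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    /** Stand-in for a real utility flagged as uncovered by Hudson. */
    class Titles {
        /** Returns "" for null input, otherwise the trimmed string. */
        static String normalize(String s) {
            return (s == null) ? "" : s.trim();
        }
    }

    /** Unit tests for the (assumed) Titles.normalize utility. */
    public class TitlesTest {

        /** Covers the null branch: normalize(null) must return "". */
        @Test
        public void testNormalizeNullReturnsEmptyString() {
            assertEquals("", Titles.normalize(null));
        }

        /** Covers the non-null branch: surrounding whitespace is removed. */
        @Test
        public void testNormalizeTrimsWhitespace() {
            assertEquals("Open File", Titles.normalize("  Open File  "));
        }
    }

Each test exercises exactly one branch, which makes it easy to trace gaps in the Hudson coverage report back to a missing test.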

Grading session: We will run the Hudson jobs, check coverage, and check the JavaDoc for completeness.

Points: 300

- Successful end-to-end execution and coverage report generation by Hudson of large system tests; all tests should pass (25)
- Successful end-to-end execution and coverage report generation by Hudson of small system tests; all tests should pass (25)
- Successful end-to-end execution, JavaDoc production, and coverage report generation by Hudson of all integration tests; all tests should pass (25)
- Successful end-to-end execution, JavaDoc production, and coverage report generation by Hudson of all unit tests; all tests should pass (25)
- Percentage of statements covered by unit tests, as reported by the Hudson jobs; all source files must be instrumented (100)
- Percentage of statements covered by integration tests, as reported by the Hudson jobs; all source files must be instrumented (100)

 

Phase 6 (graded per flex-team); due date: May. 7, 2010. (NOTE: this phase does not apply to the Windows Ripper/Replayer team.)

Goal: Incorporate new features into the overall system.
Procedure: New modules need to be incorporated into the TestCaseGenerator as plugins; see the two existing modules for examples and the sketch below. Each flex-team will discuss the requirements of a new module with the customer. Develop a requirements document, develop the code, document it with JavaDoc, develop new unit, integration, and system tests, update/create the Hudson scripts, and update/create the user and developer manuals.
Deliverables: The new code, revised documents, and uploaded materials on Hudson.
Additional reading(s):
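As a concrete starting point, the sketch below shows the shape a new TestCaseGenerator plug-in might take: a class that walks an event-flow graph and emits every length-2 event sequence. The class name, the adjacency-map representation of the EFG, and the generate() signature are all assumptions made for illustration; the real plug-in API should be taken from the two existing TestCaseGenerator modules.

    // Hypothetical TestCaseGenerator plug-in that emits every length-2
    // event sequence allowed by an event-flow graph (EFG). The types
    // and method names are illustrative stand-ins, not the real GUITAR
    // API; study the two existing plug-ins before writing your own.
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;

    public class SequenceLengthTwoPlugin {

        /** One generated test case: an ordered list of event identifiers. */
        public static class TestCase {
            public final List<String> events;

            public TestCase(List<String> events) {
                this.events = events;
            }
        }

        /**
         * Generates all length-2 test cases from an EFG given as an
         * adjacency map (event -> events that may follow it).
         */
        public List<TestCase> generate(Map<String, List<String>> followsRelation) {
            List<TestCase> testCases = new ArrayList<TestCase>();
            for (Map.Entry<String, List<String>> entry : followsRelation.entrySet()) {
                for (String next : entry.getValue()) {
                    List<String> sequence = new ArrayList<String>();
                    sequence.add(entry.getKey());
                    sequence.add(next);
                    testCases.add(new TestCase(sequence));
                }
            }
            return testCases;
        }
    }

A real plug-in would still need the requirements document, JavaDoc, unit, integration, and system tests, and the Hudson job updates listed in the deliverables above.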

Grading session:

Points: 175

- Revised requirements document (10)
- Updated user and developer manuals (15)
- JavaDoc for new code, reported via Hudson (25)
- New unit tests with 100% statement coverage and JavaDoc; successful execution via Hudson (25)
- New integration tests with 100% statement coverage and JavaDoc; successful execution via Hudson (25)
- New system tests (5 small system tests and 5 large system tests) with coverage report; successful execution via Hudson (75)

 

EXTRA CREDIT ASSIGNMENT. (Improving the wiki documentation)

- Total Points (25)
  - Main Page of Tool (15)
    - Overall "Look and Feel" (5)
    - Clear description of the tool or link to the description (5)
    - Links to user guide, developer guide, source code, and executable (5)
  - Developer Guide (10)
    - How to make changes and submit the code (5)
    - How to test the changes (5)


Copyright: Dept. of Computer Science, University of Maryland.
For problems or questions regarding this web site, contact Atif M. Memon.
Last updated: January 23, 2010.