IMPORTANT: As part of this project, you will be expected to use buggy third-party software. This is done deliberately to simulate real-life scenarios. Students are expected to work around the bugs and submit their projects on time.
Late deliverables will be accepted with a 20% penalty per day. Start your projects early; last-minute computer malfunctions will not be accepted as a reason for delaying an assignment's due date.
All teams will present their projects in class.
The university computer labs should provide all necessary support for the project. Other resources normally available to you (e.g., home computers) may be used; however, you do so at your own risk. No alterations to the conditions of the assignment will be made to accommodate the peculiarities of your other computing resources.
In Fall 2009, I asked a software company to develop a system called GUITAR (http://guitar.sourceforge.net/), which was required to include six applications (JFCGUIRipper, GUIStructure2EFG, TestCaseGenerator, JFCGUIReplayer, WindowsGUIRipper, WindowsGUIReplayer). Because the company did not use good software engineering practices, it was unable to deliver a good product. Only the first four applications (JFCGUIRipper, GUIStructure2EFG, TestCaseGenerator, JFCGUIReplayer) were completed, and those without adequate documentation. The remaining two applications (WindowsGUIRipper, WindowsGUIReplayer) remained at the prototype stage, written in C#. Now I am hiring TerpSoft to fix some of the problems.
Goal: Downloading and running all the applications on a simple input.
Procedure: In this phase, you need to demonstrate that you are able to run all six applications (please see your course instructor for the inputs to these applications). The source code for all of these applications resides in a Subversion repository that can be viewed at http://guitar.svn.sourceforge.net/viewvc/guitar/. Please refer to https://sourceforge.net/projects/guitar/develop for additional help with Subversion. Each application (let's call it foo for fun) resides in the folder foo.tool/ in the repository. Check out this folder for each application and read its README.txt file. There you will find more information about the modules that foo uses for building and execution, as well as instructions on how to build and run foo.
Deliverables: There are no deliverables for this phase. Points will be awarded for demos. (But maintain a list of problems that you encountered with the tools during this phase. You will need it in a later phase.)
Additional reading(s): http://www.cs.umd.edu/~atif/papers/MemonSTVR2007-abstract.html
Grading session: During the grading session, you will have access to a new machine (Windows or Linux) that is connected to the Internet. You will have to download, build, and execute all six applications and run them on all the inputs. You will also need to install Ant, Subversion, Java, C++, and any other tools needed to run the applications. Each application is worth 15 points.
Points: 90
We will now start using fixed teams, where each member will be assigned to a particular team. Phases 2 and 3 will be graded by team. This simulates an industry scenario in which you are hired and assigned to a specific team.
Class roster:
Abdelsalam, Mahmoud Mohame
Abdullah, Mustafa
Apau, Emmanuel Twum
Bhatt, Arya
Braun, Michael Richard
Cartas, Anika Silvina
Chitlur Lakshman, Deepak
Craciunescu, Cosmin Octavi
Dattilio, Patrick Christop
de Castro, Fernando Corra
Egan, Thomas Michael
Goldin, Paul
Greene, Jared Elliot
Isaac, Joseph
Jamison, Kasey Lynn
Kaminski, Christopher Thom
Kay, Jonathan Lee
Khan, Kashif Maqsood
Kraft, Gregory Kenneth
Lam, Pui Ying
Locke, John Bradford
Lockyear, Alexander Garber
Miluski, Matthew Scott
Musavi, Omeed
Muthara, Meria
Naegele, Joseph Daniel
Rau, Jonathan Frank
Schoenfeld, Andrew N
Sethi, Nidhi
Smith, Bria Danielle
Solomon, Joseph Michael
Sternberg, Jonathan Alan
Verner, Arseni
Woo, Kevin J
Yohannes, Mekias Mebrahtu
Young, Jason
Teams: C++, Ripper, EFG, TCG, Replayer
Goal: Understanding the requirements of your project and trying out more complex inputs.
Procedure: Schedule an interview (or multiple interviews) with the customer to understand the requirements. Also talk to the developer to get an idea of the system architecture, coding conventions, data formats, and other implementation details.
Deliverables: The requirements document, developer's manual, and user manual. (Also maintain a list of problems/bugs that you encountered during this phase. You will need it in a later phase.)
Additional reading(s): The wiki documents at http://guitar.sourceforge.net/. The Subversion repository at http://guitar.sourceforge.net/.
Grading session: The documents need to be uploaded to the http://guitar.sourceforge.net/ website. They will be examined for correctness and completeness.
Points: 210
Total Points (210)
Goal: Understanding the requirements of your project and understanding the GUIRipper-Core and GUIReplayer-Core API.
Procedure: Schedule an interview (or multiple interviews) with the customer to understand the requirements. Also talk to the developer of the GUIRipper-Core and GUIReplayer-Core to get an idea of the system architecture, coding conventions, data formats, and other implementation details.
Deliverables: The requirements document and implementation plan.
Additional reading(s): The wiki documents at http://guitar.sourceforge.net/. The Subversion repository at http://guitar.sourceforge.net/.
Grading session: The requirements document needs to be uploaded to the http://guitar.sourceforge.net/ website. It will be examined for correctness and completeness. The implementation plan needs to be handed to the instructor and will be discussed during a meeting. Note that the implementation plan will include a schedule for coding each method in the GUIRipper-Core and GUIReplayer-Core API.
Points: 210
Goal: Refactoring the existing test cases.
Procedure: All the modules used for your tool have a tests/ folder that contains unit tests. Additionally, the .tool folders contain system tests that perform system testing of the tool. Many of these tests "don't work" because of incorrect paths, missing library files, etc. Study these issues and fix them. Ensure that the tests run correctly on the Hudson continuous integration server.
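Many of the failures take the same form: a test hard-codes an absolute path that existed only on the original developer's machine. The sketch below shows one common style of fix, resolving fixtures from the test classpath; the class and fixture names here are illustrative, not actual GUITAR identifiers.

    import static org.junit.Assert.assertNotNull;

    import java.io.InputStream;
    import org.junit.Test;

    public class SampleGUIFileTest {

        @Test
        public void sampleGUIFileIsOnTheTestClasspath() {
            // Broken form: new FileInputStream("C:\\guitar\\sample.GUI")
            // only works on the machine where the test was written.
            // Portable form: resolve the fixture from the test classpath,
            // so the test behaves the same on Hudson and on lab machines.
            InputStream sample =
                getClass().getResourceAsStream("/fixtures/sample.GUI");
            assertNotNull("fixtures/sample.GUI missing from test classpath",
                          sample);
        }
    }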
Deliverables: Existing test cases that "work". (Also maintain a list of bugs that you encountered during this phase. You will need it in a later phase.)
Additional reading(s):
Grading session: The tests will be executed on the Hudson server and graded.
Points: 150
Total Points (150)
Goal: Realizing the implementation plan. Coding may be done in C++ or C#.
Procedure: Implement the code according to the implementation plan submitted for Phase 2.
Deliverables: Source code, documentation in the source, unit tests, scripts to compile and build the project, and Hudson scripts.
Additional reading(s):
Grading session: The source will be compiled and executed on the Windows inputs.
Points: Will be determined using the implementation plan; maximum 700 points.
We will now start using the flexible team (each is called a flex-team) style of project management. By now, from your experience in Phases 2 and 3, you know the strengths and weaknesses of your teammates. Create your own flex-team, consisting of at least one member (i.e., you) and at most six members, for Phases 4, 5, and 6. Flex-teams will be subsets of your original teams. If you are happy with your entire team from Phases 2 and 3, feel free to continue to work together as a flex-team. The advantage is that flex-teams can change per phase based on the performance of individuals. You don't need to inform me about the members of your flex-team until submission time. This simulates an industry scenario in which you quickly put together a team of "consultants" based on their capabilities to perform a specific task. (This change does not apply to the Windows Ripper/Replayer team.)
Goal: Reporting bugs in the existing tools.
Procedure: Use the list of problems and bugs from Phases 1-3 and document them using the bug reporting tool.
Deliverables: Bug reports on sourceforge.net. Provide details of how each bug was discovered and how it may be replicated.
Additional reading(s):
Grading session: The bug reports will be checked for completeness. Put your name on each bug report that you submit. If several of you worked together as a flex-team to find and report a bug, please put the names of all involved. If needed, you can get new SourceForge IDs and use them to report the bugs; no action is needed from my end. This phase is graded individually so that you have some flexibility in forming your flex-team. For completeness, your name should appear on at least five bug reports. Each report is worth 15 points, for a total of 15 x 5 = 75 points. If duplicate reports (i.e., reporting the same bug) are found, I will keep only the first one (they are all time-stamped) and delete the rest.
Points: 75
Bug reports on SourceForge.net, at least 5 reports (15 points per report)
Goal: Fixing all the bugs and developing additional test cases.
Procedure: Ensure that the Hudson jobs associated with your tool and modules are running. Work with your instructor to make copies of the Hudson jobs for your flex-team. Examine the Hudson reports for small, large, unit, and integration tests. Check why they are failing and fix all causes of failure. Add new unit and integration tests to get 100% statement and branch coverage for each type of test. JavaDoc all tests. Ensure that none of the Hudson jobs fail for any reason. You will work within your own folders: trunk/tests-<your-flex-team-ID> for unit tests and integration-tests-<your-flex-team-ID> for integration tests.
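As a rough sketch of what 100% branch coverage demands: every conditional needs a test for each of its outcomes, and every test carries JavaDoc. The helper class below is a hypothetical stand-in, not part of the GUITAR API; it only illustrates the shape of a JavaDoc'd test pair that covers both branches of one decision.

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    /** Hypothetical stand-in for a GUITAR helper with one branching decision. */
    class WidgetFilter {
        /** Keep a widget only if it is both enabled and visible. */
        static boolean accepts(boolean enabled, boolean visible) {
            return enabled && visible;
        }
    }

    public class WidgetFilterTest {

        /** Covers the true branch of accepts(): enabled and visible. */
        @Test
        public void acceptsEnabledVisibleWidget() {
            assertTrue(WidgetFilter.accepts(true, true));
        }

        /** Covers the false branches: disabled, and enabled but hidden. */
        @Test
        public void rejectsDisabledOrHiddenWidget() {
            assertFalse(WidgetFilter.accepts(false, true));
            assertFalse(WidgetFilter.accepts(true, false));
        }
    }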
Deliverables: The Hudson jobs for small, large, unit, and integration tests should succeed. The unit and integration tests should individually show 100% statement and branch coverage. Coverage reports and JavaDocs for all code and test cases.
Additional reading(s):
Grading session: We will run the Hudson jobs, check coverage, and check the JavaDoc for completeness.
Points: 300
Successful end-to-end execution and coverage report generation by Hudson of large system tests; all tests should pass (25)
Successful end-to-end execution and coverage report generation by Hudson of small system tests; all tests should pass (25)
Successful end-to-end execution, JavaDoc production, and coverage report generation by Hudson of all integration tests; all tests should pass (25)
Successful end-to-end execution, JavaDoc production, and coverage report generation by Hudson of all unit tests; all tests should pass (25)
Percentage of statements covered by unit tests; all source files must be instrumented, as reported by the Hudson jobs (100)
Percentage of statements covered by integration tests; all source files must be instrumented, as reported by the Hudson jobs (100)
Goal: Incorporating new features into the overall system.
Procedure: New modules need to be incorporated into the TestCaseGenerator as plugins. See the two existing modules for examples. Each flex-team will discuss the requirements of a new module with the customer. Develop a requirements document, develop the code, use JavaDoc to document it, develop new unit, integration, and system tests, update/create Hudson scripts, and update/create the user and developer manuals.
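The plugin interface itself is defined by the two existing modules, so copy its actual name and method signatures from there. The skeleton below is only a sketch of the general shape of such a plugin; every identifier in it is a placeholder, not a real GUITAR name.

    /**
     * Skeleton of a new TestCaseGenerator plugin. All names here are
     * placeholders; the real interface and its method signatures should
     * be copied from one of the two existing plugin modules.
     */
    public class MyCriterionPlugin /* implements <real plugin interface> */ {

        /** Name under which the plugin is selected when generating tests. */
        public String getName() {
            return "my-criterion";
        }

        /**
         * Walk the event-flow graph, select event sequences that satisfy
         * this plugin's coverage criterion, and write each sequence out
         * as a test-case file.
         */
        public void generateTestCases(/* EFG efg, String outputDir */) {
            // 1. traverse the EFG,
            // 2. pick sequences meeting the criterion,
            // 3. serialize each sequence as a test case.
        }
    }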
Deliverables: The new code, revised documents, and uploaded materials on Hudson.
Additional reading(s):
Grading session:
Points: 175
Revised requirements document (10)
Updated user and developer manuals (15)
JavaDoc for new code, reported via Hudson (25)
New unit tests with 100% statement coverage and JavaDoc; successful execution via Hudson (25)
New integration tests with 100% statement coverage and JavaDoc; successful execution via Hudson (25)
New system tests (5 small system tests and 5 large system tests) with coverage report; successful execution via Hudson (75)
Total Points (175)
Copyright: Dept. of Computer Science, University of Maryland.