Software Engineering

Spring 2005; CMSC 435; Section 0101

Project

IMPORTANT: As part of this project, you will be expected to use buggy third-party software. This is deliberate: it simulates real-life scenarios. Students are expected to work around the bugs and submit their projects on time.

Teams

I will partition the class into teams. The project manager is marked in red below.

Team 1: TerpSpreadsheet -- spreadsheet program with user-defined functions
  Altman, Michael Ross
  Bates, Adam MacNeil
  Boyce, Dorian E
  Chen, Drew Truland
  Chumpitazi, Anthony Christ

Team 2: TerpPaint -- paint program
  Cirujales, John Odsinada
  Delacruz, Michael William
  Horowitz, David Seth
  Jeune, Vladimir Randy
  Malhotra, Dave Kumar

Team 3: TerpPresent -- object-based drawing and presentation tool
  McFarlane, Dwayne A
  Rasekh, Ari Daniel
  Singh, Uttam
  Tucker, David Nathaniel
  Vela, Juan Miguel

Project Requirements

   You need to schedule meetings with me so that I can give you an initial set of informal requirements for your assigned software product. 

What you need to do

    Starting from the informal requirements, you will develop a complete set of requirements, design a system that meets these requirements, and finally create and test a software system that implements your design. At each step in this process you will produce corresponding documentation. All documents must be submitted in electronic format and must be written in English. You must also submit an evaluation of yourself and each of your team-mates at each of these stages.

Project Schedule

Phase 1: Inheriting the code. Due date: Feb. 21.

Summary: Ensuring that Version 4.0 works, identifying the bugs and missing parts, and updating the bug reports.

Details:

  1. Download the application source code.
  2. Examine the JavaDoc comments (each relevant field should be filled). Use doccheck to check the completeness of the JavaDoc.
  3. Compile/execute the application using the build scripts. If they are unavailable, please create them.
  4. Download all available JUnit test cases and execute them on an instrumented version of your application. Obtain failure and coverage reports. Use doccheck to check the completeness of the test cases' JavaDoc. (A sketch of a documented method and its JUnit test appears after this list.)
  5. Install and use the application -- report bugs.
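
For reference, here is a minimal sketch of the level of JavaDoc completeness and the JUnit style being asked for. The class names (CellMath, CellMathTest) and the method are hypothetical and not part of the TerpOffice code base; the test assumes the JUnit 3.8 conventions in use in Spring 2005 (test classes extend junit.framework.TestCase and test method names start with "test").

    // CellMath.java -- hypothetical production class with complete JavaDoc
    /**
     * Utility arithmetic for spreadsheet cells.
     */
    public class CellMath {

        /**
         * Returns the sum of two cell values.
         *
         * @param a the first cell value
         * @param b the second cell value
         * @return the sum of <code>a</code> and <code>b</code>
         */
        public static int add(int a, int b) {
            return a + b;
        }
    }

    // CellMathTest.java -- hypothetical JUnit 3.8 test case with its own JavaDoc
    import junit.framework.TestCase;

    /**
     * Unit tests for {@link CellMath}.
     */
    public class CellMathTest extends TestCase {

        /** Verifies that add() returns the arithmetic sum of its arguments. */
        public void testAdd() {
            assertEquals(5, CellMath.add(2, 3));
        }
    }

Such a test can be run from the command line with JUnit's text runner (java junit.textui.TestRunner CellMathTest) once junit.jar and the compiled classes are on the classpath.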

Deliverables

  1. Source Code. Output of doccheck.
  2. Build and execute scripts (plain-text files).
  3. Plain-text README file describing how to build and execute the software on any platform.

The source code submission should include only SOURCE files. Code archives (e.g., .jar) should not be submitted. The build scripts should be able to install the software on a new computer, which has only the OS installed. The grader will first delete all archives and derived files (e.g., .jar, .exe, .class files) from your submission; if any are found, you will lose 100 points.

The grader will then read your README file (a plain-text file) and use it to build your application. The build script should return a meaningful error and instructions if additional system software (such as a compiler) needs to be installed. If the script does not return a meaningful message, then the grader will abandon the installation; you will lose all points for this submission. You are not allowed to require the installation of any third-party tools and/or libraries. A successful build is not worth any points. The grader will repeat this process on three platforms -- Mac OS X, Windows XP, and Linux.

The final message from the build script (after a successful build) should give detailed instructions on how to execute your application. An "execute script" should be generated/provided that automatically executes the application. Without this script, the grader will not run your application.

  4. JUnit test cases & their JavaDoc (test cases should have 95% statement coverage and 100% method coverage; submit coverage reports).
  5. Printout of your bug reports from Bugzilla (each bug report should have a description of the bug and the test-case attachment that was used to detect it).

 

Phase 2: Requirements and User Scenarios. Due date: Mar. 1.

Summary: Requirement Analysis Document and Scenarios

Deliverables

  1. VORD templates 
  2. Viewpoint hierarchy
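
As an illustration only (your actual viewpoints must come out of the customer interviews), VORD distinguishes direct viewpoints, which interact with the system, from indirect viewpoints, which are stakeholders with an interest in the system but no direct interaction. A starting hierarchy for a product like TerpPaint might look like the sketch below; every name in it is a placeholder, not a required structure:

    All viewpoints
      Direct
        Users
          Casual user
          Power user
        Interfacing systems
          Operating system / window manager
          File system (image files)
      Indirect
        Customer (instructor)
        Graders
        Maintenance team (next semester's class)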

Deliverables

  1. Get scenarios from the customer via interviews.
  2. 100 scenarios. The structure of each scenario is [initial software state, event sequence, final software state] (a LaTeX sketch of one scenario appears after the grading policy below). Of the 100 event sequences:
     - 25 should be of length 1 or more.
     - 25 should be of length 10 or more.
     - 25 should be of length 20 or more.
     - 15 should be of length 25 or more.
     - 10 should be of length 30 or more.

GRADING POLICY: This submission is worth 200 points. You are expected to submit (1) the VORD documents (100 points) and (2) valid LaTeX source files with 100 scenarios (100 points).

For the scenarios, the initial and final software states should be described using line drawings of screenshots accompanied by text fully describing the state in detail. If any environment settings, such as variables and files, are needed for the state description, please describe them in detail. The event sequence should be described in as much detail as possible. A software user should be able to use these scenarios to bring the software to the described initial state, execute the event sequence steps, and recreate the final state without any additional help. Please provide a file called "toplevel.tex" that compiles to the final document. The grader will compile the files using the command "pdflatex toplevel". If the LaTeX source files are not compilable, then you will lose all points for this submission. The grader will then check the 100 scenarios. You get 1 point for each correct scenario.
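
One possible LaTeX layout is sketched below, assuming one scenario per section and the remaining scenarios pulled in with \input; the packages, file names, and scenario content are illustrative only. The firm requirements from the paragraph above are that a file called toplevel.tex exists and that "pdflatex toplevel" produces the final document.

    % toplevel.tex -- the grader runs: pdflatex toplevel
    \documentclass{article}
    \usepackage{graphicx}   % for the line drawings of screenshots

    \begin{document}

    \section*{Scenario 1: Enter a value in an empty worksheet}

    \textbf{Initial state:} TerpSpreadsheet is running with a new, empty
    worksheet and no file open (line drawing and full textual description
    of the state go here).

    \textbf{Event sequence:} (1) click cell A1; (2) type 42; (3) press Enter.

    \textbf{Final state:} cell A1 displays 42 and the worksheet is marked as
    modified (line drawing and full textual description go here).

    % Remaining 99 scenarios, e.g. one file each:
    % \input{scenario002}
    % ...

    \end{document}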

 

Phase 3: Coding the new requirements. Due date: Apr. 2.

Summary: Complete Working Software Code (Version 5.0)

Deliverables

  1. Source Code.
  2. Build and execute scripts (plain-text files).
  3. Plain-text README file describing how to build and execute the software on any platform.

GRADING POLICY: This submission is worth 300 points. The source code submission should include only SOURCE files. Code archives (e.g., .jar) should not be submitted. The build scripts should be able to install the software on a new computer, which has only the OS installed. The grader will first delete all archives and derived files (e.g., .jar, .exe, .class files) from your submission; if any are found, you will lose 100 points.

The grader will then read your README file (a plain-text file) and use it to build your application. The build script should return a meaningful error and instructions if additional system software (such as a compiler) needs to be installed. If the script does not return a meaningful message, then the grader will abandon the installation; you will lose all points for this submission. You are not allowed to require the installation of any third-party tools and/or libraries. A successful build is not worth any points. The grader will repeat this process on three platforms -- Mac OS X, Windows XP, and Linux.

The final message from the build script (after a successful build) should give detailed instructions on how to execute your application. An "execute script" should be generated/provided that automatically executes the application. Without this script, the grader will not run your application.

The grader will then execute 100 scenarios (not necessarily the ones that you submit) taken directly from the requirements that were given by the customer. Each successful scenario per platform is worth 1 point.

 

Phase 4: Testing and Bug Reports. Due date: Apr. 24.

Summary: Testing, bug reporting, and Documentation (Version 5.0)

Deliverables (Choose 300 methods in the code for this phase; these methods should not have existing JUnit test cases.)

  1. JUnit test cases & their JavaDoc. Test cases should have 95% statement coverage and 100% method coverage, and each of the chosen methods should have at least one unit test case. In addition:
     - Submit code faults, i.e., comments in the method's code of the form /*FAULT::   ....*/ whose contents, if substituted for the current line, will cause the method's test case to fail (a sketch of this convention follows this list).
     - Provide one script to run all unit test cases fully automatically (NO HUMAN INTERVENTION) on the code.
     - Provide a second script to automatically insert the code faults and run the test cases, showing that all of these test cases fail.
     - Use doccheck to check the completeness of your JavaDoc.
     - Provide a third script to instrument the code and execute the JUnit test cases automatically on the instrumented code (you can use any instrumenter). Once the JUnit test cases have been executed on the instrumented code, this script should generate a coverage report that summarizes the "cumulative" coverage of all covered methods (as one number) as well as the coverage per method.
  2. Source code with JavaDoc comments (each relevant field should be filled). Use doccheck to check the completeness of your JavaDoc.
  3. Printout of at least 100 bug reports from Bugzilla (each bug report should have a description of the bug and the test-case attachment that was used to detect it). If the software does not have 100 bugs, state this claim explicitly in your submission; however, in that case, if the grader finds unreported bugs while executing the software, you will lose points.
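
To make the /*FAULT:: ...*/ convention concrete, here is a minimal sketch with a hypothetical method and test (MathUtil and MathUtilTest are illustrative names, not part of the TerpOffice code). The idea is that the contents of the FAULT comment, if substituted for the line immediately below it, make the corresponding JUnit test fail.

    // MathUtil.java -- hypothetical example
    public class MathUtil {

        /**
         * Returns the larger of two integers.
         *
         * @param a the first value
         * @param b the second value
         * @return the larger of <code>a</code> and <code>b</code>
         */
        public static int max(int a, int b) {
            /*FAULT:: return b; */
            return (a > b) ? a : b;
        }
    }

    // MathUtilTest.java -- hypothetical JUnit 3.8 test
    import junit.framework.TestCase;

    /** Unit test that detects the fault above. */
    public class MathUtilTest extends TestCase {

        /** Passes on the correct code; fails if "return b;" replaces the line below the FAULT comment. */
        public void testMax() {
            assertEquals(7, MathUtil.max(7, 3));
        }
    }

A fault-insertion script can then mechanically substitute each line that follows a /*FAULT:: ...*/ comment with the comment's contents, recompile, and rerun the test suite to confirm that every such test fails.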

GRADING POLICY:

  1. JUnit test cases per method: ((Number of test cases)/(Total number of methods)) * 20. Maximum points = 20.
  2. Coverage of test cases: (((Number of Statements Covered)/(Total Number of Statements)*100) / 95) * 20. Maximum points = 20.
  3. JavaDoc per test case: Formula based on "Executive Summary" output of doccheck. Maximum points = 20.
  4. JavaDoc per test case manual checking: 20 instances of JavaDoc will be selected and evaluated for completeness, consistency and correctness. Maximum points = 20.
  5. Code faults per test case: ((Number of failed test cases due to code faults)/(Total number of methods)) * 30. Maximum points = 30.
  6. Bug reports: ((Number of bug reports submitted)/100) * 20. Maximum points = 20.
  7. Scripts: You are expected to submit three scripts to: (1) run the JUnit test cases on your code, (2) insert code faults, and run test cases on the fault-inserted code, and (3) instrument the code, run JUnit test cases on the instrumented code and generate a "cumulative" and "per method" coverage report. The above evaluation points will be awarded only if the grader can execute the JUnit test cases, etc. using your scripts. Debugging a problematic script or running individual test cases is left to the discretion of the grader.
  8. JavaDoc per method: Formula based on "Executive Summary" output of doccheck. Maximum points = 20.
  9. JavaDoc per method manual checking: 20 instances of JavaDoc will be selected and evaluated for completeness, consistency and correctness. Maximum points = 20.
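
As a worked example with hypothetical numbers: a team that submits 240 unit test cases for its 300 chosen methods earns (240/300) * 20 = 16 of the 20 points under item 1, and if those tests cover 1,800 of 2,000 statements, the coverage score under item 2 is ((1800/2000 * 100) / 95) * 20 ≈ 18.9 of 20 points.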

 

Phase 5: User Manual and Web Site. Due date: May 11.

Summary: User manual (user guide) and updated web site for Version 5.0

GRADING POLICY:

WEB-SITE: [This submission is worth 200 points; bonus points will be given for creativity] The web site should be submitted on a CD. It should have only one root folder at the top level, named one of: TerpManager, TerpWord, TerpPaint, TerpSpreadSheet, TerpCalc, TerpPresent. I will copy the root folder to the www.cs.umd.edu web server. Please don't hardcode any links, i.e., all links should be relative to your root folder. Also, don't make any assumptions about the web server (note that cgi and other server-side scripts are not allowed; directory browsing is also not allowed). The root folder should contain a file called index.html that points to the "first page" (or Home) of your site. The contents of your entire site should be accessible (directly or indirectly) from this page. Provide at least the following pages:

  1. "About Us" -- lists your names.
  2. "Contact Us" -- points to your Terp????@cs.umd.edu contact e-mail address.
  3. "Report a Bug" -- points to all bugs related to your version of Terp???? on bugs.cs.umd.edu (yes, if you have reported bugs elsewhere, please migrate them to bugs.cs.umd.edu).
  4. "FAQs" -- contains answers to some commonly asked questions about your software/installer/developers; you may get some questions from the TA or your instructor.
  5. "Install" -- provides instructions (and a link to an archive containing only SOURCE files and install scripts) on how to install your application.
  6. "Site Map" -- shows the structure of your site as a tree.
  7. "Downloads" -- allows users to download any (or all) documents and code submitted during this semester; provide a mechanism to let the user first select the needed files and then download them with one button-click.

Each page should contain links to all of these pages and to Home. (Feel free to get feedback on partially created web sites before the submission date. I strongly encourage all groups to work together on their web sites; it would also be nice to have one TerpOffice Version 4.0 page that links to all Terp???? applications in version 4.0.)
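
A possible root-folder layout, consistent with the requirements above, is sketched below; all file names other than index.html (and the root folder name, which must be one of the allowed Terp???? names) are illustrative only:

    TerpPaint/                  root folder, copied as-is to the web server
      index.html                Home; links to every page below
      about.html                "About Us"
      contact.html              "Contact Us"
      report-a-bug.html         "Report a Bug" (links to bugs.cs.umd.edu)
      faq.html                  "FAQs"
      install.html              "Install" (links to the source archive)
      sitemap.html              "Site Map"
      downloads.html            "Downloads"
      downloads/                documents and code archives offered for download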

USER-MANUAL: [This submission is worth 100 points; bonus points will be given for creativity] Submit ONE pdf file that is the user guide, along with all of its source files. The sources should produce a document identical to the submitted pdf file. The user guide should contain the following sections: cover page, table of contents, introduction (overview of the software, platform restrictions, etc.), installation guide, and "Working with Terp????" (features of your software, and how-tos for the use cases/scenarios submitted in an earlier phase).

 

Final Phase: Putting it all together. Due date: May. 18.

Summary: Complete Version 5.0. Please put everything that you have developed this semester (including the PowerPoint slides) on one CD.

Computing Resources

    The university computer labs should provide all necessary support for the project. Other resources normally available to you (e.g., home computers) may be used, but you do so at your own risk: no alterations to the conditions of the assignment will be made to accommodate peculiarities of your other computing resources.

Project Presentation

    All teams will present their projects in class. Details of the presentation are available here. Every student will need to fill out an evaluation sheet.

"The dog ate my homework"

    Late deliverables will be accepted with a 10% penalty per day. Start your project early - last-minute computer malfunctions will not be accepted as a reason for delaying an assignment's due date.

 


Copyright: Dept. of Computer Science, University of Maryland.
For problems or questions regarding this web site, contact Atif M. Memon.
Last updated: January 12, 2006.