Software Engineering

Spring 2005; CMSC 435; Section 0201



Project

Teams

    I will partition the class into teams of approximately six students. Each team will consist of a testing and verification group (Group 1) of three or four students and a coding group (Group 2) of three or four students. The following list summarizes the duties of these groups:

* Project Manager
* Testing and Verification Group (Group 1)
    * GUITAR Tests for Latest Version
    * Update Bug Reports for Latest Version
    * VORD Documents
    * Scenario Creation for New Version, with the Customer
    * Preconditions/Postconditions
    * GUITAR Tests for New Version
    * Update Bug Reports for New Version
    * User Manual Creation and Web-Site Update for New Version
* Coding Group (Group 2)
    * JavaDoc File Creation for Latest Version
    * JUnit Test Cases for Latest Version
    * Update Bug Reports for Latest Version
    * VORD Documents
    * Implementation of New Version
    * JavaDoc File Creation for New Version
    * JUnit Test Cases for New Version
    * Final Installable (One File) Software Creation for New Version

IMPORTANT: As part of this project, you will be expected to use buggy 3rd-party software. This is done deliberately to simulate real-life scenarios. Students are expected to work around the bugs and submit their projects on time.

Each team's project manager will be a member of one of the two groups.

Team 1: TerpCalc (Graph Calculator with Filing Capability & User-defined Functions)
    Testing: Atamas, Nicholas; Abbas, Sumair; Nguyen, Thanh Dang
    Coding: Lin, Michael Y; Rozi, Faraz; Rho, Yong Doo

Team 2: TerpWord (Word Processor with Graph/Image Display Capability)
    Testing: Kang, Kenneth; Zuckerman, Jay; Ibrahim, Ahmad Hassan; Padencov, Serban Nicolae
    Coding: Miller, Kenneth Todd; Wolfe, Mark Alan; Mansoor, Muhammad Salman

Team 3: TerpSpreadsheet (Spreadsheet Program with User-defined Functions)
    Testing: Dharia, Harshal Jayesh; Tahan, Jonathan David; Wheeler, Scott Philip
    Coding: Kim, Tae Hoon; Chow, Daniel Kenneth; Thaler, Evan Scott; Ma, Xiao

Team 4: TerpPaint (Paint Program)
    Testing: Tamizi, Matin; Ziskind, David Ryan; Rabinovich, Svetlana
    Coding: Wissmann, Robert William; Cheng, Mo; Tao, Shih-Chieh

Team 5: TerpPresent (Object-based Drawing and Presentation Tool)
    Testing: Zhu, Tommy M; Wong, Eric Tak Wai; Kim, Sun Ki; Lowe, Walter Payne
    Coding: Ryan, Joseph Francis; Rash, Kathryn R; Ellison, Jason Jeffrey

Team 6: TerpManager (An Explorer)
    Testing: Ambikapathi, Girija; Garrett, Lindsey Anne; Lee, Hana
    Coding: Lee, Joseph Hansung; Axelrod, Adam Marc
 

Project Requirements

   You need to schedule meetings with me so that I can give you an initial set of informal requirements for your assigned software product. 

What you need to do

    Starting from the informal requirements, you will develop a complete set of requirements, design a system that meets these requirements, and finally create and test a software system that implements your design. At each step in this process you will produce corresponding documentation. All documents must be submitted in electronic format and must be written in English. You must also submit an evaluation of yourself and each of your team-mates at each of these stages.

Project Schedule

Phase 1 (Testing group; due Feb. 22): GUITAR Tests & Bug-Report Updates for Version 3.0

Deliverables

  1. GUITAR test cases:
     * Structural: 2000 test cases (length 1 or more) + coverage report (Method 100%, Statement 95%)
     * Manual: 900 test cases (length 20 or more) + coverage report (Method 100%, Statement 95%)
     * Random: 1000 test cases (length 30 or more) + coverage report
     * PDDL reachability test cases + coverage report

  2. Successful/unsuccessful log files (automatically generated by GUITAR).

  3. Printout of your bug reports from Bugzilla (each report should have a description of the bug and, as an attachment, the test case used to detect it); a sample report layout appears below.
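
As a rough illustration only (the field layout on bugs.cs.umd.edu may differ, and the application behavior and attachment name below are invented), a well-formed bug report might look like this:

    Summary:     TerpCalc exits silently when plotting a function with an empty body
    Product:     TerpCalc          Version: 3.0
    Description: Steps to reproduce: (1) open the function editor, (2) save a
                 user-defined function with an empty body, (3) choose Plot.
                 Expected: an error dialog. Actual: the application exits with
                 no message.
    Attachment:  testcase-0417.tst (the GUITAR test case that detects the bug)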

Phase 2 (Coding group; due Feb. 22): JavaDoc File Creation, Unit Test Cases & Bug-Report Updates for Version 3.0

Deliverables

  1. Source code with JavaDoc comments (each relevant field should be filled); a sketch of the expected completeness appears below.
  2. JavaDoc HTML output.
  3. JUnit test cases & their JavaDoc (test cases should achieve 95% statement coverage and 100% method coverage; submit coverage reports).
  4. Printout of your bug reports from Bugzilla (each report should have a description of the bug and, as an attachment, the test case used to detect it).
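
As a hedged sketch of the JavaDoc completeness expected in deliverable 1, consider this small hypothetical class (the class and its methods are invented for illustration; every relevant tag of each member is filled in):

    import java.util.HashMap;
    import java.util.Map;

    /**
     * A table of user-defined named constants for a calculator.
     * (Hypothetical example; your own classes will differ.)
     */
    public class ConstantTable {

        /** Maps constant names to their numeric values. */
        private final Map<String, Double> constants = new HashMap<String, Double>();

        /**
         * Defines or redefines a named constant.
         *
         * @param name  the constant's name; must be non-empty
         * @param value the value to associate with the name
         * @throws IllegalArgumentException if name is null or empty
         */
        public void define(String name, double value) {
            if (name == null || name.length() == 0) {
                throw new IllegalArgumentException("name must be non-empty");
            }
            constants.put(name, Double.valueOf(value));
        }

        /**
         * Looks up a previously defined constant.
         *
         * @param name the constant's name
         * @return the value currently bound to name
         * @throws IllegalArgumentException if name has not been defined
         */
        public double lookup(String name) {
            Double value = constants.get(name);
            if (value == null) {
                throw new IllegalArgumentException("undefined constant: " + name);
            }
            return value.doubleValue();
        }
    }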
Phase 3 (Testing & Coding groups; due Mar. 7): Requirement Analysis Document

Deliverables

  1. VORD templates 
  2. Viewpoint hierarchy
Phase 4 (Testing group; due Apr. 4): Design Document

Deliverables

  1. Get scenarios from the customer via interviews.
  2. 100 scenarios. The structure of each scenario is [initial software state, event sequence, final software state]:
     * 25 sequences of length 1 or more
     * 25 sequences of length 10 or more
     * 25 sequences of length 20 or more
     * 15 sequences of length 25 or more
     * 10 sequences of length 30 or more

GRADING POLICY: This submission is worth 100 points. You are expected to submit valid LaTeX source files with 100 scenarios. The initial and final software states should be described using captured screenshots accompanied by text fully describing the state in detail. If any environment settings, such as variables and files, are needed in the state description, please describe them in detail. The event sequence should be described in as much detail as possible. A software user should be able to use these scenarios to bring the software to the described initial state, execute the event sequence steps, and recreate the final state without any additional help.

Please provide a file called "toplevel.tex" that compiles to the final document; one possible skeleton appears below. The grader will compile the files using the command "pdflatex toplevel". If the LaTeX source files are not compilable, you will lose all points for this submission. The grader will then execute the 100 scenarios on three platforms: Mac OS X, Windows XP, and Linux. For each scenario that runs successfully on all three platforms, you get 1 point. No partial points will be given if results vary by platform.
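
A minimal sketch of one way to organize the LaTeX submission, assuming invented screenshot file names and a home-made \scenario macro (nothing here is mandated beyond the toplevel.tex name and the pdflatex toplevel build command):

    % toplevel.tex -- the grader compiles this with: pdflatex toplevel
    \documentclass{article}
    \usepackage{graphicx}   % screenshots of the initial and final states

    % A home-made helper; one \scenario call per scenario keeps the layout uniform.
    \newcommand{\scenario}[4]{%
      \subsection*{Scenario #1}
      \paragraph{Initial state.} #2
      \paragraph{Event sequence.} #3
      \paragraph{Final state.} #4}

    \begin{document}
    \section*{TerpCalc Version 4.0 Scenarios}

    \scenario{1}%
      {Application freshly launched, default settings, no open files.
       \includegraphics[width=6cm]{s001-initial.png}}
      {Click ``7'', then ``+'', then ``3'', then ``=''.}
      {The display shows 10. \includegraphics[width=6cm]{s001-final.png}}

    % \input{s002} ... \input{s100}  (one file per scenario scales better)
    \end{document}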

Phase 5 (Coding group; due Apr. 4): Complete Working Software Code (Version 4.0)

Deliverables

  1. Source Code.
  2. Build and execute scripts (plain-text files).
  3. Plain-text README file describing how to build and execute the software on any platform.

GRADING POLICY: This submission is worth 300 points. The source code submission should include only SOURCE files; code archives (e.g., .jar) should not be submitted. The grader will first delete all archives and derived files (e.g., .jar, .exe, .class files) from your submission; if any are found, you will lose 100 points.

The grader will then read your README file (a plain-text file) and use it to build your application. The build scripts should be able to install the software on a new computer that has only the OS installed. The build script should return a meaningful error and instructions if additional system software (such as a compiler) needs to be installed; if the script does not return a meaningful message, the grader will abandon the installation and you will lose all points for this submission. You are not allowed to require the installation of any third-party tools and/or libraries. A successful build is not worth any points.

The grader will repeat this process on three platforms: Mac OS X, Windows XP, and Linux. The final message from the build script (after a successful build) should give detailed instructions on how to execute your application. An "execute script" should be generated/provided that automatically executes the application; without this script, the grader will not run your application.

The grader will then execute 100 scenarios (not necessarily the ones that you submit) taken directly from the requirements that were given by the customer. Each successful scenario per platform is worth 1 point.

Phase 6 (Testing group; due Apr. 21): Test Cases & Bug-Report Updates for Version 4.0

Deliverables (Choose 300 methods in the code for this phase)

  1. JUnit test cases & their JavaDoc (test cases should achieve 95% statement coverage and 100% method coverage); each method should have at least one unit test case. Also submit code faults, i.e., comments in the method's code of the form /*FAULT:: ...*/ that, if substituted for the line they annotate, will cause the method's test case to fail (a sketch appears after this list). Use doccheck to check the completeness of your JavaDoc. Provide three scripts: one to run all unit test cases fully automatically on the code; one to automatically insert the code faults and then run the test cases, showing that all of them fail; and one to instrument the code (you can use any instrumenter), execute the JUnit test cases automatically on the instrumented code, and generate a coverage report that summarizes the "cumulative" coverage of all covered methods (as one number) as well as the coverage per method.
  2. Printout of at least 100 bug reports from Bugzilla (each report should have a description of the bug and, as an attachment, the test case used to detect it). If the software does not have 100 bugs, state this claim in your submission; however, in that case, if the grader finds unreported bugs while executing the software, you will lose points.
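
To make the fault-comment convention concrete, here is a hedged sketch using the hypothetical ConstantTable class from the phase-2 example and JUnit 3-style tests (the seeded fault's exact format is whatever your scripts and the grader agree on):

    // In ConstantTable.lookup(), a seeded fault sits beside the line it replaces.
    // Substituting the comment's text for the line makes both tests below fail:
    //     if (value == null) {          /*FAULT:: if (value != null) {*/

    import junit.framework.TestCase;

    /** Unit tests for the hypothetical ConstantTable class. */
    public class ConstantTableTest extends TestCase {

        /** lookup must return the value most recently passed to define. */
        public void testDefineThenLookup() {
            ConstantTable table = new ConstantTable();
            table.define("pi", 3.14159);
            assertEquals(3.14159, table.lookup("pi"), 1e-9);
        }

        /** lookup of an undefined name must throw IllegalArgumentException. */
        public void testLookupUndefined() {
            ConstantTable table = new ConstantTable();
            try {
                table.lookup("nope");
                fail("expected IllegalArgumentException");
            } catch (IllegalArgumentException expected) {
                // pass: with the fault seeded, no exception is thrown and fail() trips
            }
        }
    }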

GRADING POLICY:

  1. JUnit test cases per method: ((Number of test cases)/(Total number of methods)) * 20. Maximum points = 20.
  2. Coverage of test cases: (((Number of Statements Covered)/(Total Number of Statements)*100) / 95) * 20. Maximum points = 20.
  3. JavaDoc per test case: Formula based on "Executive Summary" output of doccheck. Maximum points = 20.
  4. JavaDoc manual checking: 20 instances of JavaDoc will be selected and evaluated for completeness, consistency and correctness. Maximum points = 20.
  5. Code faults per test case: ((Number of failed test cases due to code faults)/(Total number of methods)) * 30. Maximum points = 30.
  6. Bug reports. ((Number of bug reports submitted)/100) * 20. Maximum points = 20.
  7. Scripts: You are expected to submit three scripts to: (1) run the JUnit test cases on your code, (2) insert code faults, and run test cases on the fault-inserted code, and (3) instrument the code, run JUnit test cases on the instrumented code and generate a "cumulative" and "per method" coverage report. The above evaluation points will be awarded only if the grader can execute the JUnit test cases, etc. using your scripts. Debugging a problematic script or running individual test cases is left to the discretion of the grader.
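
For example, under items 1 and 2 above, a group that submits unit tests for 280 of its 300 chosen methods earns (280/300) * 20 = 18.7 of the 20 available points, and a coverage report showing 90% statement coverage earns ((90/95) * 20) = 18.9 of 20.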
Phase 7 (Coding group; due Apr. 21): JavaDoc File Creation, Unit Test Cases & getIcon() for Version 4.0

Deliverables (use the same 300 methods as in the previous phase)

  1. Source code with JavaDoc comments (each relevant field should be filled). Use doccheck to check the completeness of your JavaDoc.
  2. Code for the getIcon() method. Informally, the method should take a file name (of your application's document type) as a parameter and return a jpeg or gif image: a dynamically generated thumbnail showing a miniature, icon-sized image of the file. Please get the exact requirements from the TerpManager group. A sketch under assumed requirements appears after this list.
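
Because the binding requirements come from the TerpManager group, the following is only a sketch under assumed requirements: a static method taking a file name and returning a 32x32 thumbnail. The class name, signature, and icon size are all assumptions.

    import java.awt.Graphics2D;
    import java.awt.Image;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.IOException;
    import javax.imageio.ImageIO;

    /** Hypothetical home for getIcon(); the real interface comes from TerpManager. */
    public class IconProvider {

        private static final int ICON_SIZE = 32;   // assumed thumbnail size

        /**
         * Renders an icon-sized thumbnail of the given document.
         *
         * @param fileName path to a file of this application's document type
         * @return a dynamically generated miniature image of the file
         * @throws IOException if the file cannot be read
         */
        public static BufferedImage getIcon(String fileName) throws IOException {
            // For an image-based application (e.g., TerpPaint) the document is
            // itself an image; other applications would render their document
            // model into the buffer instead of calling ImageIO.read.
            BufferedImage full = ImageIO.read(new File(fileName));
            if (full == null) {
                throw new IOException("unreadable document: " + fileName);
            }
            BufferedImage icon =
                    new BufferedImage(ICON_SIZE, ICON_SIZE, BufferedImage.TYPE_INT_RGB);
            Graphics2D g = icon.createGraphics();
            g.drawImage(full.getScaledInstance(ICON_SIZE, ICON_SIZE, Image.SCALE_SMOOTH),
                        0, 0, null);
            g.dispose();
            return icon;
        }
    }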

GRADING POLICY:

  1. JavaDoc per test case: Formula based on "Executive Summary" output of doccheck. Maximum points = 20.
  2. JavaDoc manual checking: 20 instances of JavaDoc will be selected and evaluated for completeness, consistency and correctness. Maximum points = 20.
  3. Working version of getIcon(). Maximum points = 30 (no partial points for this step).
Phase 8 (Testing group; due May 09): User Manuals (User Guide) and Web-Site Update for Version 4.0

GRADING POLICY:

WEB-SITE [worth 200 points; bonus points will be given for creativity]: The web-site should be submitted on a CD. It should have only one root folder at the top level, named for your application: TerpManager, TerpWord, TerpPaint, TerpSpreadSheet, TerpCalc, or TerpPresent. I will copy the root folder to the www.cs.umd.edu web server, so please don't hardcode any links: all links should be relative to your root folder. Also, don't make any assumptions about the web server (note that cgi and other server scripts are not allowed; directory browsing is also not allowed). The root folder should contain a file called index.html that points to the "first page" (or Home) of your site; the contents of your entire site should be accessible, directly or indirectly, from this page. Provide at least the following pages:

  1. "About Us": lists your names.
  2. "Contact Us": points to your Terp????@cs.umd.edu contact e-mail address.
  3. "Report a Bug": points to all bugs related to your version of Terp???? on bugs.cs.umd.edu (yes, if you have reported bugs elsewhere, please migrate them to bugs.cs.umd.edu).
  4. "FAQs": answers to some commonly asked questions about your software/installer/developers; you may get some questions from the TA or your instructor.
  5. "Install": instructions (and a link to an archive containing only SOURCE files and install scripts) on how to install your application.
  6. "Site Map": shows the structure of your site as a tree.
  7. "Downloads": lets users download any (or all) documents and code submitted during this semester; provide a mechanism to let the user first select the needed files and then download them with one button click.

Each page should contain links to all of these pages and to Home. Feel free to get feedback on partially created web-sites before the submission date. I strongly encourage all groups to work together to produce the web-sites; it would also be nice to have one TerpOffice Version 4.0 page that links to all the Terp???? applications in Version 4.0.

USER-MANUAL [worth 100 points; bonus points will be given for creativity]: Submit ONE pdf file that is the user guide, plus all of its source files; the sources should produce a document identical to the submitted pdf. The user guide should contain the following sections: cover page, table of contents, introduction (overview of the software, platform restrictions, etc.), installation guide, and Working with Terp???? (features of your software, and the how-to use-cases submitted in an earlier phase).

In addition, some of you will submit a UML document that will be called the "Design Document". Please keep in mind that students will edit this document next year; if necessary, please provide links to (free) software that can be used to edit your files. Please don't use any commercial software.

Phase 9 (Coding group; due May 09): Debugged Final Deliverable (One-Click Installable) Code for Version 4.0

GRADING POLICY: This submission is worth 300 points. The grader will use the "Install" page of your web-site to download and build your application. If the download or build process fails, the grader will abandon the installation and you will lose all points for this submission. You are not allowed to require the installation of any third-party tools and/or libraries. A successful build is not worth any points.

The grader will repeat this process on three client platforms: Mac OS X, Windows XP, and Linux. The final message from the build script (after a successful build) should give detailed instructions on how to execute your application. An "execute script" should be generated/provided that automatically executes the application; without this script, the grader will not run your application.

The grader will then execute 100 scenarios (not necessarily the ones that you submit) taken directly from the requirements that were given by the customer. Each successful scenario per platform is worth 1 point. Note that several scenarios may fail due to one underlying bug; you will lose points for each failed scenario.
Also note that an action/event may not make sense in some contexts. You may handle such contexts in either of two ways: (1) disable the event in that context, or (2) pop up a dialog saying that the event is not allowed in that context. The incorrect way is to throw an exception without a meaningful message: a scenario that yields an uncaught or improperly handled exception will be considered "failed". A sketch of both options appears below.
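
As a hedged Swing sketch of both options, here is a hypothetical Paste action built around an invented ClipboardModel interface (your application's classes will differ):

    import java.awt.event.ActionEvent;
    import javax.swing.AbstractAction;
    import javax.swing.JOptionPane;

    /** Invented application interface; your own document/clipboard model will differ. */
    interface ClipboardModel {
        boolean hasContent();
        void paste();
    }

    /** A context-dependent action handled the two acceptable ways. */
    public class PasteAction extends AbstractAction {

        private final ClipboardModel clipboard;

        public PasteAction(ClipboardModel clipboard) {
            super("Paste");
            this.clipboard = clipboard;
            // Option 1: disable the action where it makes no sense; Swing greys
            // out every menu item and button built from a disabled Action.
            setEnabled(clipboard.hasContent());
        }

        public void actionPerformed(ActionEvent e) {
            // Option 2: if the event fires anyway, explain why nothing happens
            // rather than letting an exception escape (a failed scenario).
            if (!clipboard.hasContent()) {
                JOptionPane.showMessageDialog(null,
                        "Nothing to paste: the clipboard is empty.",
                        "Paste", JOptionPane.INFORMATION_MESSAGE);
                return;
            }
            clipboard.paste();
        }
    }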


Computing Resources

    The university computer labs should provide all necessary support for the project. Other resources normally available to you (e.g., home computers) may be used; however, doing so is at your own risk. No alterations to the conditions of the assignment will be made to accommodate peculiarities of your other computing resources.

Project Presentation

    All teams will present their projects in class. Details of the presentation are available here. An evaluation sheet must be filled out by every student.

"The dog ate my homework"

    Late deliverables will be accepted with a 10% penalty per day. Start your project early: last-minute computer malfunctions will not be accepted as a reason for delaying an assignment's due date.

 



Copyright: Dept. of Computer Science, University of Maryland.
For problems or questions regarding this web site, contact Atif M. Memon.
Last updated: January 06, 2005.