CAR-TR-672
CS-TR-3069
May 1993
ABSTRACT
The AT&T Teaching Theater is a highly interactive, multimedia
electronic classroom at the University of Maryland offering instructors
many new and creative teaching opportunities. Although this technology
may hold many exciting possibilities, it is important not to lose
sight of the main objective of any teaching facility - the students.
Therefore, the important questions are: "How do students
rate the AT&T Teaching Theater? What are their opinions of
the various types of software programs currently offered? Do they
facilitate or interfere with the learning process?" This
paper discusses the results from a survey of students who attended
classes in the AT&T Teaching Theater, Fall semester, 1992.
A comparison among the different types of software used by the
various instructors is the focus of this evaluation. In particular,
HyperCourseware, a program providing an "electronic infrastructure"
for computer-based education, will be at the center of this comparison.
HyperCourseware is a "work in progress" and is one of
the few software packages used in the electronic classroom designed
with the Teaching Theater in mind. The findings from this paper
will be used to determine where improvements need to be made in
order to benefit the students and to make the most of the technology
offered in the AT&T Teaching Theater in the future.
INTRODUCTION
Creating better environments for students is a priority for most of today's educators. As we move into the future, it will become increasingly important for these educators to develop new methods for challenging and stimulating students academically. Technology provides a means for creating just such an atmosphere. By bringing together education and technology, the electronic classroom attempts to offer a first step in the solution. An electronic classroom is a special room that provides computers for the students, networked and linked with an instructor's computer. The electronic classroom offers students a degree of control over the learning process, as well as the ability to give input and feedback to the instructors. With the aid and convenience of technology, instructors now have the opportunity to provide more creative, flexible programs and to stimulate the students through a variety of media.
However, with such opportunities for advancement come problems.
Since the electronic classroom is based on a computer network,
all of the processing limitations often found in new systems,
such as speed and reliability, must be dealt with. In addition,
for the classroom to be used to its full potential, instructors
currently must spend more time preparing each lesson.
What, then, is the net loss or gain for the students? Do these problems
interfere significantly with the learning process, or are they
merely minor hassles that will eventually be eliminated, allowing
the students to fully enjoy the benefits of the electronic
classroom?
This paper uses as its model the AT&T Teaching Theater at
the University of Maryland at College Park, Maryland. Previous
research has evaluated the Teaching Theater in terms of its physical
configuration and usability (Norman & Lindwarm, 1993; Norman
& Carter, 1992). In this study, a comparison among the various
software products currently used in the Teaching Theater will
be made, with a specific focus on HyperCourseware, a specialized
software package developed primarily for the Teaching Theater.
The Classroom
The layout of the AT&T Teaching Theater is a lecture-style
setup with the instructor at the head of the classroom facing
the students. Figure 1 shows a schematic for this classroom at
the University of Maryland. The instructor's desk contains the
equipment necessary to operate all of the media in the classroom
and to interface with the students' computers. The media used
in the classroom include: two 4'x 6' high resolution rear projection
screens, a video overhead projector, a VCR, a video disc player,
and a CD player. In addition, the instructor has access to all
of the students' machines and, using a video switcher, may display
the contents of any student's screen on the instructor's monitor,
which may in turn be displayed on the front screen and/or
on the other students' monitors. Finally, the instructor has the
capability to "take over" a student's display to assist
or make corrections.
The students have access to 20 personal computer workstations
(each workstation can serve two students - allowing for a total
of 40 students per class). Each workstation is equipped with a
keyboard, a mouse and a 17" high resolution color monitor,
which is recessed into the desk to allow for good sight lines
and to conserve space. The computers may be used to present class
information (overheads, lecture notes, reading material), as scratch
pads, for note-taking, or for computing purposes (programming,
word processing, and networking).
The supporting computer systems are AT&T 25 MHz 386-based
units. The computers are linked through an AT&T Starlan
network, then through a Novell server and are eventually
linked to the Internet, thus allowing students outside access
to their accounts. For noise reduction, as well as comfort, the
AT&T Teaching Theater is carpeted and the computer units are
housed in an adjacent room. Lighting is controlled from a
control panel at the instructor's workstation.
Software
A major focus of this paper is the comparison among the many types
of software packages used in the Teaching Theater. Table 1 in
the Appendix shows the courses and their associated software packages
used in the Teaching Theater for the Fall semester of 1992.
The following descriptions give some insight into the breadth
and variety of software used in the Teaching Theater.
Chat: A Novell utility that allows the users to "talk" interactively on-line with each other. Chat is a groupware product which allows and encourages user interaction.
WordPerfect: A stand-alone, DOS-based word processor with automatic referencing, document comparison, dictionary and thesaurus, mouse support, mail merge, macros, table of contents generator, and text-integrated graphics.
Quattro Pro: A spreadsheet that allows consolidation of data from
more than one spreadsheet into a single spreadsheet, graph or
chart. Quattro Pro has extensive spreadsheet publishing and graphics
features for use in presentations, and a built-in draw program
that allows the user to edit charts and graphs. Quattro Pro is
primarily a stand-alone product.
SAS: A single user statistical package designed for data analysis
and report generation.
GEDS: The Global Events Database Editor was used in a Government/Politics class on "Conflict and Peace Analysis". It is a single user product.
Visionquest: Marketed as a group decision support system (GDSS), VisionQuest offers a collaborative environment for sharing ideas anonymously. Various tools in this system, especially brainwriting and comment cards, can be used in the academic setting to support a collaborative learning environment.
HyperCourseware: "A system of interlocking programs and files that serves as an electronic infrastructure by creating tokens on a computer network that represent the familiar objects of instruction such as the syllabus, the class roll, lecture notes, exams, and grade lists." (Norman & Lindwarm, 1993)
Paradox: A relational database manager that allows the user to
access and manipulate data in many ways. The user can perform
calculations, create graphs, and regroup information.
Most of these products are "off-the-shelf" and were,
therefore, not designed specifically for the Teaching Theater.
They have the advantage, though, of having gone through extensive
product development and rigorous testing. In addition, most of
these products work mainly as "stand-alone" programs;
that is, they are intended for use by a single individual. A few
of these products are actually intended for communication and
decision-making purposes, and are used as such in the Teaching
Theater. HyperCourseware, however, is the only product designed
for the specific requirements and goals of the Teaching Theater
by combining teacher-directed instruction with student input and
the capability for interaction. Therefore, the development of
HyperCourseware will be a primary focus of this paper. Issues
concerning current usability problems will be addressed so as
to better understand how this product can be improved for future
use in the Teaching Theater.
HyperCourseware
Described as an "electronic infrastructure", HyperCourseware
creates an environment in which the instructor can create lectures
and discussion platforms, interactive experiments, surveys, quizzes
and much more. In addition, the students and instructor can view
the current day's seating chart, notes from previous classes,
mail messages from the instructor or other classmates and can
look ahead to upcoming classes by browsing the course syllabus.
HyperCourseware attempts to utilize what is best about the Teaching
Theater - the ability to combine lectures and student participation
in one package.
METHOD
Subjects
Thirteen classes were taught in the AT&T Teaching Theater
in a variety of subjects ranging from English to Computer Science
(see Table 1 in the Appendix), resulting in a wide range of
prior computer experience and course content. Each class had up
to 40 students enrolled, and course offerings ranged from the
undergraduate to the graduate level. Of the thirteen classes,
eight participated in the survey: five administered the survey
on-line, and the remaining three used paper-and-pencil
questionnaires. One course with a small enrollment was not
counted in the final calculations because only one student
participated in the survey. Participation was
voluntary and the students were told that their answers and remarks
would not have any effect on their grade in the course.
QUIS
The Questionnaire for User Interaction Satisfaction version 5.5
is a general test for usability which measures a user's subjective
satisfaction with a particular interface (Chin, Diehl, & Norman,
1988; Harper & Norman, 1993). Questions used in the QUIS range
from general interface satisfaction (e.g., overall reaction to
the system) to more detailed interface questions (e.g., rating
the helpfulness of error messages). In addition, the QUIS can
be tailored to the specific application being rated (e.g., "Were
technologies in the classroom used to full potential?").
The layout of the QUIS for this survey was as
follows:
PARTS 1 & 2: System and demographic information
PART 3: Overall User Reactions (6 questions)
PART 4: Screen (4 questions)
PART 5: Terminology and System Information (6 questions)
PART 6: Learning (6 questions)
PART 7: System Capabilities (5 questions)
PART 8: Media Effect (4 questions)
PART 9: Technology (4 questions)
PART 10: Accessibility (1 question)
Each of the questions in Parts 3-10 used a 9-point (1-9)
scale with an n/a option for questions that were not
applicable.
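To make the scoring concrete, the following sketch (in Python, with invented question labels and ratings) shows one way QUIS-style responses could be tallied, with n/a answers excluded from each question's mean. It illustrates the scoring convention only; it is not the software used to administer or score the actual survey.

    # Hypothetical illustration of QUIS-style scoring: ratings on a 1-9 scale,
    # with "n/a" responses excluded from the per-question means.
    from statistics import fmean

    # Each respondent's answers, keyed by question number (value is 1-9 or "n/a").
    responses = [
        {"3.1": 7, "4.1": 8, "5.6": "n/a"},
        {"3.1": 6, "4.1": 9, "5.6": 4},
        {"3.1": 8, "4.1": 7, "5.6": 5},
    ]

    def question_means(responses):
        """Return the mean rating for each question, ignoring n/a answers."""
        questions = sorted({q for r in responses for q in r})
        means = {}
        for q in questions:
            ratings = [r[q] for r in responses if q in r and r[q] != "n/a"]
            means[q] = fmean(ratings) if ratings else None
        return means

    print(question_means(responses))  # {'3.1': 7.0, '4.1': 8.0, '5.6': 4.5}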
Evaluation Procedure
During the last week of the semester, all of the students attending
the Fall 1992 semester classes were administered the QUIS either
on-line or in paper form. As noted in the previous section, students
were asked to evaluate the software and other media in the AT&T
Teaching Theater. Data were grouped according to combinations
of Software used (see Table 1 in the Appendix). In some cases,
two software products were grouped together for evaluation (such
as WordPerfect with Chat, and Quattro Pro with SAS) since they
were used together in a particular course. Unfortunately, since
the survey did not distinguish between the two products, the results
for those products cannot be clearly differentiated.
RESULTS
Table 2 lists the means for satisfaction on the overall reactions
to the Teaching Theater. Since these values are highly correlated,
a total score was calculated for each type of software. Table
3 gives mean satisfaction ratings for the more detailed information
and Table 4 shows mean responses for AT&T-specific questions.
Within each table, boldfaced items indicate significant differences
among the products, and the highlighted values represent the
specific significant contrasts. From these data, it
is evident that, overall, Quattro Pro had higher ratings than
HyperCourseware and Visionquest. In particular, questions concerning
overall reaction (p<0.01), overall stimulation (p<0.05),
and overall power (p<0.05) of the system showed significant
differences depending on the software used. This difference is
also reflected in the Overall Totals score (p<0.01). For overall
reaction, Quattro Pro and GEDS came out on top and HyperCourseware
and Visionquest rated significantly lower. Additionally, Quattro
Pro rated highly on questions involving system speed (p<0.01),
reliability (p<0.01), (limited) media interference (p<0.05),
and media helpfulness (p<0.05). Other areas which showed significant
differences among the products include: how often the computer
informs the user about what it is doing (p<0.05), computer
helpfulness (p<0.05) and adequacy of learning time (p<0.05).
Quattro Pro and GEDS were rated significantly higher than Visionquest
and Paradox on the question of how much improvement the student
experienced with the system (p<0.01). HyperCourseware also rated
significantly higher than Paradox on this question (p<0.05).
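For readers unfamiliar with the analysis, the sketch below (in Python, with invented rating vectors) illustrates the type of comparison reported here: an omnibus one-way ANOVA across software groups followed by a Scheffe-style pairwise contrast. It is a minimal sketch of the procedure under those assumptions, not the actual data or analysis code used for this survey.

    # Minimal sketch (with made-up ratings) of the analysis pattern reported above:
    # a one-way ANOVA across software groups, then a Scheffe-style pairwise contrast.
    import numpy as np
    from scipy import stats

    groups = {
        "Quattro Pro": np.array([8, 9, 7, 8, 8, 9], dtype=float),
        "HyperCourseware": np.array([6, 5, 7, 6, 6, 5], dtype=float),
        "Visionquest": np.array([6, 7, 5, 6, 7, 6], dtype=float),
    }

    samples = list(groups.values())
    k = len(samples)                        # number of groups
    n_total = sum(len(g) for g in samples)  # total number of ratings

    # Omnibus one-way ANOVA across all groups.
    f_stat, p_value = stats.f_oneway(*samples)
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

    # Scheffe contrast between two groups, using the pooled within-group MSE.
    mse = sum(((g - g.mean()) ** 2).sum() for g in samples) / (n_total - k)
    a, b = groups["Quattro Pro"], groups["HyperCourseware"]
    contrast_f = (a.mean() - b.mean()) ** 2 / (mse * (1 / len(a) + 1 / len(b)))
    critical = (k - 1) * stats.f.ppf(0.95, k - 1, n_total - k)
    print(f"Scheffe contrast F = {contrast_f:.2f}, critical value = {critical:.2f}")
    print("Significant at 0.05" if contrast_f > critical else "Not significant at 0.05")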
Although these data may at first look somewhat discouraging with
respect to HyperCourseware, it is important to note that almost
all of the mean scores for each set of data fall above the benchmark
score of 5 on the scale. Graph 1, in the Appendix, shows a scatterplot
of the overall mean for each question, with one standard deviation above
and below the mean designated by the accompanying bars. It is
interesting to note some of the outlying points on this scatterplot.
Questions 4.1 (regarding readability of characters on the computer
screen) and 7.3 (noise of the system) both rated quite high (favorably),
with overall means of 7.84 and 8.15, respectively. Conversely,
questions 5.6 (Helpfulness of error messages) and 10.1 (Accessibility
of the system from the Workstations at Maryland (WAM) labs) rated
considerably poorer than the rest of the scores with overall means
of 6.04 and 4.73. Similar results are shown in Graph 2, which
gives the same information exclusively for the HyperCourseware scores.
Overall, the data reflect ratings of subjective user satisfaction
well above the benchmark score of 5 on the 9-point scale, with
the same issues standing out as noticeably divergent scores.
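As a concrete illustration of the summary plotted in Graphs 1 and 2, the sketch below (with an invented response matrix) computes, for each question, the overall mean rating and the one-standard-deviation band around it.

    # Hypothetical sketch of the values behind Graphs 1 and 2: for each question,
    # the overall mean rating and a one-standard-deviation band around the mean.
    import numpy as np

    # Rows are respondents, columns are questionnaire items (ratings 1-9);
    # the numbers are invented purely for illustration.
    ratings = np.array([
        [7, 8, 6, 5],
        [8, 8, 7, 4],
        [6, 9, 5, 5],
        [7, 7, 6, 4],
    ], dtype=float)

    means = ratings.mean(axis=0)
    stds = ratings.std(axis=0, ddof=1)   # sample standard deviation

    for item, (m, s) in enumerate(zip(means, stds), start=1):
        print(f"Question {item}: mean = {m:.2f}, band = [{m - s:.2f}, {m + s:.2f}]")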
DISCUSSION
As can be seen from the data collected, there are many discrepancies in student satisfaction with regard to the types of software used in the Teaching Theater. On the one hand, the variety of software products demonstrates the great flexibility and diversity which the Teaching Theater has to offer. On the other hand, the great degree of variability among the different products makes them difficult to compare reliably. However, the purpose of this survey is to get a "feel" for how the students are responding to the Teaching Theater - whether the Teaching Theater provides a positive experience for the student (which it apparently does) - and to better understand some of its current shortcomings. In particular, the usefulness of HyperCourseware
as a teaching tool should be evaluated, as it is a work in progress
designed specifically for the AT&T Teaching Theater. From
the data it is clear that HyperCourseware falls short in user
satisfaction on a number of factors. These include: (a) Overall
Reaction, (b) System Speed, (c) Reliability, (d) Media Helpfulness
and (e) Media Interference. It did, however, fare well regarding
students' improvement in computer usage over the semester.
It is necessary to mention here that, although we were able to gather some interesting feedback from the students, we were not able to control for a number of confounding effects of student groups and classes. The confounding factors include: the experience levels of students across classes, the range in degree of difficulty and type of course curriculum (Statistics vs. History vs. Computer Science vs. English), the level of involvement of the computer, and the format of the class (lecture vs. discussion vs. group projects, etc.). These confounding factors prevented a truly unbiased analysis among the different software products. Nevertheless, the feedback from the students does provide some groundwork for understanding what is successful and where improvement is needed. Additionally, it gives some suggestions about the advantages and shortcomings of HyperCourseware.
What is important to note, however, is that HyperCourseware is
currently under development, whereas most of the other software
products are fully developed marketed products. The factors regarding
speed and reliability are heavily tied into the process of development
- that is, newer programs that are in the development phase tend
to be slower and less reliable. As for the interference of media,
this is probably due to the high degree of media interaction
in the courses that utilized HyperCourseware. Again, the
slow speed of processing probably added to the frustration level
of the students and caused them to feel that the media interfered
with their learning. Finally, it is interesting to note that the
students felt that their ability to use the computer improved
significantly during the semester. This is probably due to the
fact that HyperCourseware required a great degree of user interaction.
Overall, many of the features that cause HyperCourseware's low
relative scores are also factors that should improve through
the normal course of the development cycle. Keeping in mind that
HyperCourseware is a work in progress, it is important to understand
that it, like the Teaching Theater, is being developed not just
for the current classroom, but also for the classroom of the future.
The main goal is to provide an enriching atmosphere for learning
which will inspire students and promote a high level of academic
performance.
ACKNOWLEDGMENTS
Partial support for this project was provided by a grant from
AT&T Information Systems and the Computer Science Center at
the University of Maryland. We wish to thank Walt Gilbert, Project
Director of the AT&T Teaching Theater and Ellen Yu, Project
Manager, for their support as well as the pioneering instructors
and students who are exploring learning in the electronic classroom.
REFERENCES
Black, A., et al. (1992). Consulting on-line dictionary information while reading.
MRC Applied Psychology, 4, 145-168.
Chin, J.P., Diehl, V.A., & Norman, K.L. (1988). Development of an instrument
measuring user satisfaction of the human-computer interface. In CHI '88 Conference
Proceedings: Human Factors in Computing Systems, (pp.213-218), New York:
Association for Computing Machinery.
Harper, B. D., & Norman, K.L., (1993). Improving user satisfaction: The
questionnaire for user interaction satisfaction version 5.5. Proceedings of the
Mid-Atlantic Human Factors Conference, Virginia Beach, February,
224-228.
Norman, K.L. (1992). Defining and Assessing Usability in Emerging Systems: A Case
Study of the Electronic Classroom. Submitted for the Proceedings of
Usability Concepts and Procedures: Third Conference on Quality Documentation.
November 19-21, 1992. University of Waterloo.
Norman, K.L. & Carter, L.E. (1992, May). A Preliminary Evaluation of the Electronic
Classroom: The AT&T Teaching Theater at the University of Maryland.
(CAR-TR-621 and CS-TR-2891). Department of Psychology and Human/Computer
Interaction Laboratory, University of Maryland, College Park,
MD.
Norman, K.L., & Lindwarm, D. (1993). Human/computer interaction in the electronic
classroom. Proceedings of the Mid-Atlantic Human Factors Conference, Virginia
Beach, February, 217-223.
Table 1: Software Products and their associated courses:
SOFTWARE PRODUCT | COURSE TYPE |
Chat | ENGL (English) |
WordPerfect | ENGL (English) |
Quattro Pro | ANTH (Anthropology), BMGT (Business & Management), ENCE (Civil Engineering), HIST (History) |
SAS | ANTH (Anthropology) |
GEDS | GVPT (Government & Politics) |
Visionquest | BMGT (Business & Management), ENGL (English) |
HyperCourseware | PSYC (Statistics for the Behavioral Sciences), PSYC (Psychology) |
Paradox | CMSC (Computer Science), ENCE (Civil Engineering) |
Table 2:
Mean ratings on a 9-point scale (1-9):
Notes (regarding the following 3 tables):
** = Significant difference(s) at the 0.01 level - (contrasts designated by bold type)
* = Significant difference(s) at the 0.05 level - (contrasts designated by bold type)
Comparisons among the various software products were analyzed
by Scheffe F-test.
a) Overall (General) Measurements:
DESCRIPTION | WP / CHAT | QUATTRO PRO / SAS | GEDS | VISIONQUEST | HYPERCOURSEWARE | PARADOX |
Overall Reaction** | 7.15 | 8.00 | 8.17 | 6.27 | 6.11 | 6.34 |
Overall satisfaction | 7.23 | 7.29 | 7.08 | 5.88 | 6.08 | 6.05 |
Overall stimulation* | 7.38 | 7.71 | 7.58 | 6.24 | 5.76 | 6.13 |
Overall ease | 7.46 | 7.07 | 7.25 | 6.69 | 6.62 | 7.05 |
Overall power* | 7.15 | 7.93 | 7.17 | 6.55 | 6.05 | 5.66 |
Overall flexibility | 7.15 | 6.71 | 7.00 | 5.98 | 5.84 | 6.11 |
TOTALS** | 7.25 | 7.45 | 7.37 | 6.27 | 6.08 | 6.22 |
Table 3:
b) Nonspecific Software Measurements:
DESCRIPTION | WP / CHAT | QUATTRO PRO / SAS | GEDS | VISIONQUEST | HYPERCOURSEWARE | PARADOX |
Readability of chars | 7.83 | 8.14 | 7.75 | 7.84 | 7.89 | 7.58 |
Help of highlighting | 7.58 | 7.36 | 8.33 | 7.23 | 7.03 | 7.26 |
Help of screen layouts | 7.40 | 7.36 | 8.17 | 6.63 | 6.79 | 7.12 |
Clear screen sequence | 7.67 | 7.50 | 6.90 | 6.59 | 6.79 | 6.94 |
Consistent use of terms | 7.22 | 7.40 | 7.70 | 6.43 | 7.13 | 6.79 |
Relative terminology | 6.80 | 7.70 | 7.62 | 6.42 | 6.60 | 6.75 |
Clarity of messages | 7.54 | 7.27 | 7.50 | 6.57 | 6.86 | 6.70 |
Helpfulness of messages | 8.00 | 7.20 | 6.75 | 6.48 | 6.68 | 6.94 |
Computer informs* | 6.80 | 7.45 | 6.75 | 5.72 | 5.82 | 5.85 |
Help of Error messages | 6.18 | 6.80 | 6.73 | 5.96 | 4.61 | 5.94 |
Ease of learning | 8.00 | 7.17 | 7.42 | 6.49 | 7.08 | 6.90 |
Exploring Encouraged* | 7.33 | 7.55 | 8.00 | 6.38 | 6.74 | 6.89 |
Ease of remembering | 6.67 | 6.77 | 7.08 | 6.08 | 7.24 | 6.29 |
Straightforward tasks | 6.91 | 7.08 | 7.67 | 6.54 | 6.66 | 6.89 |
Clarity of Help msgs | 7.00 | 6.91 | 6.67 | 6.17 | 5.80 | 6.08 |
Clarity of suppl. refs | 7.30 | 7.50 | 6.30 | 5.32 | 5.89 | 5.36 |
System speed** | 7.40 | 8.50 | 7.08 | 6.69 | 5.24 | 5.82 |
Reliability** | 7.40 | 8.27 | 6.92 | 6.72 | 5.50 | 6.05 |
Noise of system | 8.40 | 8.58 | 8.58 | 7.67 | 8.32 | 7.35 |
Ease of correction | 7.50 | 7.83 | 7.75 | 6.67 | 6.50 | 6.50 |
All levels of experience | 7.00 | 6.42 | 7.00 | 5.84 | 6.29 | 6.51 |
Table 4:
c) AT&T Teaching Theater-Specific Measurements:
DESCRIPTION | WP / CHAT | QUATTRO PRO / SAS | GEDS | VISIONQUEST | HYPERCOURSEWARE | PARADOX |
Instructor integration | n/a | 7.77 | 6.80 | 6.26 | 6.79 | 6.58 |
Media interference* | n/a | 8.08 | 7.30 | 5.84 | 5.16 | 6.17 |
Media helpfulness* | n/a | 8.17 | 7.00 | 6.12 | 6.05 | 5.42 |
Computer helpfulness* | n/a | 7.79 | 7.36 | 6.18 | 6.16 | 5.70 |
Tech used to potential | n/a | 7.36 | 6.67 | 5.88 | 6.00 | 5.06 |
Adequate time to learn* | n/a | 6.43 | 7.58 | 5.52 | 6.27 | 6.47 |
Ability to attend | n/a | 7.08 | 7.75 | 6.23 | 6.95 | 6.65 |
Improvement** | n/a | 8.36 | 8.50 | 5.69 | 6.27 | 4.08 |
Accessibility from WAM | n/a | 4.64 | 4.54 | 5.71 | 4.09 | 4.68 |
n/a = No data collected for this group.
Graph 1: Scatterplot of overall mean ratings (all software products combined), with one standard deviation error bars.
Graph 2: Scatterplot of HyperCourseware mean ratings, with one standard deviation error bars.