Methods for observing software users in the workplace will become
increasingly important as the number of people using computers
grows and developers improve existing systems. Successful redesigns
rely, in part, on complete and accurate evaluations of the existing
systems. Based on our evaluation experience, we have derived a
set of practical guidelines to be used by designers in preparing
for the evaluation, performing the field study, analyzing the
data, and reporting the findings. By providing a general framework
based on ethnographic research, we hope to reduce the likelihood
of some common problems, such as overlooking important information
and misinterpreting observations. Examples from our ongoing work
with the Maryland Department of Juvenile Justice are used to illustrate
the proposed guidelines.
KEYWORDS: Ethnography, Anthropology, Participant observation,
Design methods, Redesign, Evaluation, User studies
There is currently great enthusiasm for user interface (UI) research
methods in developing new interfaces. These include focus groups,
surveys, scenario elicitations, cognitive walkthroughs, controlled
experimentation, usability testing and usability reviews (Chin,
Diehl & Norman, 1988; Karat & Karat, 1992; Dumas &
Redish, 1993; Hix & Hartson, 1993; Nielsen, 1993; Shneiderman,
1992). These methods can be dramatically cost effective in speeding
development, improving quality and reducing costs, but in the
rich literature on processes for UI research and development it
is difficult to find reports about how to evaluate user interfaces
by observing users in their workplace. We recognize that Industrial/Organizational Psychology (Muchinsky, 1993) encompasses organizational field studies in general, but our focus here is on studying the human-computer interface and presenting practical guidelines for UI designers.
Successful redesigns depend, in part, on a thorough and accurate
evaluation of the existing system. Vanniamparampil, Shneiderman,
Plaisant, and Rose (1995) discuss the effort and outcomes of
several reengineering projects. In our research, we have found
observation, hands-on experience, and questionnaires to be useful
evaluation techniques. The focus of this report is how to successfully
evaluate systems by applying ethnographic methods, such as participant
observation. According to Muller, Wildman, and White's taxonomy
(1993) of participatory design practices, ethnographic methods
are one of the earliest techniques that can be utilized in the
software development cycle.
As a guide to observational approaches, we explored the ethnographic
literature (Hammersley & Atkinson, 1983; Howard, 1989; Pelto
& Pelto, 1978; Spradley, 1980) and have attempted to draw
parallels, make distinctions, and learn from the methods developed
during this century. An ethnographer "participates, overtly or covertly, in people's daily lives for an extended period of time, watching what happens, listening to what is said, asking questions" (Hammersley & Atkinson, 1983). As ethnographers,
UI designers gather insight into the overall context of an organization.
UI designers differ from traditional ethnographers in that in
addition to understanding, UI designers observe systems being
used for the purpose of changing and/or improving the systems.
While traditional ethnographers tend to immerse themselves in
cultures for weeks or months, UI designers need to limit this
process to a period of days or even hours, and still obtain the
relevant data needed to influence a redesign.
Today's anthropologists are broadening their research scopes to
include cultures found in corporate organizations (Vaske &
Grantham, 1989). Based on their own experience, Hughes, King,
Rodden, and Anderson (1995) have identified several different
uses of ethnography in system design. Early work by Suchman (1983)
recommended a new line of ethnographic research for office systems
design. More recently there have been several instances where
ethnographic methods have successfully been used in the software
development process. Nardi and Miller (1990) interviewed several
users to understand how spreadsheets are developed. In another
study, a manual air traffic control system was observed for the
purpose of designing a new electronic system (Bentley, Hughes,
Randall & Sawyer, 1992; Hughes, Randall & Shapiro, 1992).
We have used observation techniques in the redesign of several
user interfaces (Shneiderman, 1993), including ACCESS, an on-line
catalog for the Library of Congress, Corabi's telepathology workstation,
and the Maryland Department of Juvenile Justice's information
system. While several case studies applying ethnographic methods
can be found in the literature, we have not found documented methods
for how to conduct these evaluations. In this report, we propose
a general methodology for applying ethnographic techniques to
the redesign of user interfaces.
Successful evaluations should take a system's unique attributes
into account. For instance, some systems may only have a single
user while others may have thousands of users whose needs are
similar (e.g., airline reservationists) or different (e.g., writers
or artists). As ethnographers, designers can adapt the method
used to accommodate unique situations. Our method uses an iterative
approach of continually refining goals and the strategies for
achieving them. It is based on principles of participatory design
described by Carmel, Whitaker, and George (1993) and Miller (1993).
The goal of an evaluation is to obtain the necessary data to influence
system redesign. Unfortunately, it is very easy to misinterpret
observations and overlook important information. Providing a general
ethnographic framework reduces the likelihood of these problems.
Based on our experience, we have derived guidelines for preparing
for the evaluation, performing the field study, analyzing the
data, and reporting the findings.
These guidelines are presented from the perspective of outsiders
evaluating a system. While inside evaluators may require a shorter
preparation time, it is generally easier for outsiders to remain
objective. To illustrate these guidelines, examples from our work
with the Maryland Department of Juvenile Justice (DJJ) are used
throughout this report. We are currently under contract with DJJ
to make recommendations on how to improve ISYS (Information System
for Youth Services), a terminal-based system to support the processing
of approximately 50,000 case referrals per year for delinquent
youth behavior. ISYS is used by approximately 600 DJJ employees
in offices and facilities across the state.
Over a six-month period, we conducted 22 field visits. Multiple
interviews took place during many of the visits. Each interview
lasted anywhere from 30 minutes to several hours. Prior to the
interviews, the Questionnaire for User Interaction Satisfaction
(QUIS) was customized to evaluate ISYS and administered to 332
personnel (Slaughter, Norman & Shneiderman, 1995).
PREPARING FOR THE EVALUATION
Adequate preparation can be the key to a successful evaluation.
Preparation not only makes a designer more familiar with the intricacies
of a system, it also enhances credibility during the field study
and promotes productive field visits.
Understand organization policies and work culture.
Learning how an organization operates provides a basic understanding
of the user's work environment. This can be initiated by talking
with managers and reading organization literature. Annual reports
and multi-year plans are useful for getting an overview of the
organization and its future directions. Find out how management
measures the organization's success, how information is communicated,
who is responsible for making decisions, what the policies are
for promotion, how employees are motivated, and what the consequences
are for not using the system correctly. All of these play a role
in how users interact with the system. While management can help
in understanding the organization, do not assume that management
necessarily knows what is really going on in the company.
Example: The primary ways
we learned about DJJ were through discussions with upper management
and reading their three year plan. We learned several things that
helped us during our field study. First, communication is a big
problem for DJJ since its employees are scattered around the state.
Another problem we discovered is that most DJJ employees are severely
overworked due to the large volume of case referrals. Our research
also suggested that motivation and data accuracy might be a problem
since DJJ employees were constantly being reminded that "information
systems are only as good as the data put in them."
Familiarize yourself with the system and its history.
Before the field study, the designers should become as familiar
with the system as novice users. Start learning about the system
by reading the available documentation, like the user's guide
and any records of user suggestions and/or complaints. Better
yet, play the role of a novice user by attending training sessions
and getting some hands-on experience. Hands-on experience is one
of the better ways to become familiar with a system.
Learn who the designers of the current system were, where they
are now and whether there have been prior redesign efforts. Being
familiar with previous efforts helps prepare the interviewer for
any preconceived notions that employees may have about the current
effort. Extra steps may need to be taken to overcome worker skepticism
if there have been failures. To avoid the same mistakes, take
the time to understand why a particular effort failed. Be sensitive
to the fact that the failed attempt might have been undertaken
in-house by current employees.
Example: To familiarize
ourselves with ISYS, first we read the user's guide. This clearly
demonstrated to us that the current user's guide does not provide
adequate information for a new ISYS user. From our own hands-on
experience with ISYS, we immediately found several sources of
user frustration, such as the 11 steps required to log in and
the numerous breakdowns. From discussions with the MIS staff,
we learned about their frustrations regarding outdated software/hardware,
limited documentation, no access to the original external implementor,
and limited personnel.
Set initial goals and prepare questions.
The goals set the focus of the field study. They might include
finding ways to improve data accuracy, increase system usage,
or reduce user workload. Keeping the goals in mind, prepare some
questions to ask the users during the field visits. If one of the goals is to improve data accuracy, for example, prepare additional questions that probe that goal directly.
Example: In the DJJ project,
our initial goals were to improve user satisfaction, increase
data accuracy, increase system use, and improve productivity.
Our questions were phrased accordingly.
Gain access and permission to observe/interview.
When identifying the users to interview, consider the complexity
of the system, the various functions of the users, and the time
available. For complex systems, it may be necessary to observe
some areas several times. Multiple visits can help resolve contradictions,
clarify misunderstandings, and obtain different perspectives.
It is easier to decide which areas to observe multiple times once
the field study has started. Having a central contact who is both
familiar with the system and has a lot of user contacts is essential
when studying complex systems. This contact can help coordinate
the interviews, ensure adequate coverage of important system aspects,
and provide immediate feedback on the findings.
Permission to perform the field visits needs to be granted. Do
not assume that all users will want to be observed. Convincing
workers that your study is not a threat can be far from easy (Howard,
1989). It helps to explain the goals of the project, what the
interview involves, and how the success of the project can benefit
them personally. Management can encourage employees to participate
by giving them permission to take time from their work schedules.
However, participation should be voluntary. If workers are forced
to participate, they may not be very cooperative.
Before the field study begins it must be decided whether or not
the identity of the users observed/interviewed will be kept confidential.
Management may want to hold workers accountable for what is said,
while the users may not be as cooperative if their anonymity cannot be guaranteed.
Example: Because of the
complexities of ISYS, having a central contact with a good overall
understanding of the system has been critical to our effort, primarily because different DJJ offices have varying procedures
for using ISYS. Our contact has also played a key role in assuring
employees that our team was cleared to view sensitive youth records.
PERFORMING THE FIELD STUDY
Performing the field study can be the most rewarding part of the
evaluation process. One of the keys to a successful field study
is recognizing that workers are intelligent, creative, and productive
contributors to the development process (Miller, 1993). Since
users are the people most familiar with the system being redesigned,
their opinions and insights are invaluable.
Establish rapport with managers and users.
At the start of each field visit, convey the project goals and
how they are going to be achieved. As an agent of change, be careful
to provide enough information to motivate the participants without
being deceptive. Give realistic examples of how the redesigned
system could benefit the person in terms of reduced workload,
increased satisfaction, and so on.
Taking the time to address any concerns that users have regarding
the interview process will help put them at ease. It also helps
to emphasize that the objective is to evaluate the system not
its users. Users will also be interested in the backgrounds and
qualifications of the designers.
Try to remove any obstacles that might interfere with an open
interview. This can be accomplished through some "impression
management" (Hammersley & Atkinson, 1983), such as dressing
appropriately and being familiar with the terminology used. The
contact person can be useful in determining the unwritten rules
of the workplace. Being familiar with the language used not only
facilitates communication, it also conveys credibility. Also,
be aware of differences in professional, organizational, and educational
backgrounds that might interfere.
Example: DJJ employees
have a strict dress code so we were careful to dress appropriately
(instead of wearing our usual jeans) and act in a professional
manner. Because we did not learn all the DJJ terminology ahead
of time, it became obvious in the initial interviews that we were
only able to record what was being said and what was being done.
Once we started to understand more of the terminology we were
able to be much more active during the interviews.
Observe/interview users in their workplace and collect subjective/objective data.
Various types of data can be collected during the field visits. This data is collected by watching, listening, and questioning users. Plan to spend at least an hour for each visit.
Complex aspects of the system may require more time. The goal
is to spend enough time to get a good overview of how the system
is normally used. If the visit is too short, important information
may be missed. Subsequent visits can be used to confirm the importance/frequency
of findings from earlier visits. A thorough field visit gathers
data pertaining to the users, their work environment, and the
tasks they perform.
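To make these three areas concrete, a visit record can be given an explicit structure before the first interview. The following minimal Python sketch is only an illustration; the FieldVisit class and its field names are our own invention, not part of any established instrument.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class FieldVisit:
    """One field visit, organized around the three data areas:
    the users, their work environment, and the tasks they perform."""
    visit_date: date
    office: str
    user_background: dict = field(default_factory=dict)    # experience, training, tenure
    environment_notes: list = field(default_factory=list)  # cheat sheets, hardware, interruptions
    task_observations: list = field(default_factory=list)  # tasks seen, durations, problems

# A hypothetical record from one visit:
visit = FieldVisit(
    visit_date=date(1995, 3, 14),
    office="intake office",
    user_background={"tenure_years": 4, "training": "learned from peers"},
    environment_notes=["login cheat sheet taped to the terminal"],
    task_observations=["referral entry interrupted twice by phone calls"],
)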
One of the basic principles of design is "know your audience."
Begin by getting an overview of the user's daily routine. Inquire
about their educational and professional background, including
how long they have been with the organization. Learn about the
user's experience with computers (e.g., other applications they
use, whether or not they have a computer at home, etc.), record
how long they have used the current system, and classify how they
learned it (e.g., self taught, formal training, peers).
A user's physical work environment directly influences system
use. Physical objects in the work environment may hint at inadequacies
of the system. Look for things on the walls, like cartoons or
instruction sheets, "cheat" sheets, the availability and age of documentation, paper files, the amount of desk
space available, the quantity and quality of the hardware, the
lighting, etc. Also, watch for interruptions, like the phone ringing
or people stopping by to ask questions.
In order to get a realistic picture of how a system is used, it
is important to observe users doing actual work, not just performing
demonstration tasks. Look for repetitive tasks, evidence of duplicate
work (e.g., parallel paper system), problem driven solutions,
instances when the system is slow or crashes, and functionalities
that are rarely used. Also watch for cases when the user has to
make notes to remember something or makes a print out simply to
fax it somewhere else.
Several things can be done to promote more productive field visits.
Suggest that the users think aloud as they work and encourage
them to give honest feedback. Using anecdotes from previous visits
helps elicit comments. It may be necessary to reassure users more
than once that what is disclosed will not affect their job. Emphasize
the importance of user opinions. Users are the best qualified
to know what their jobs require and how to improve them. When
a person says "I'm not a computer expert so I can't help",
explain how the system should be designed for both novices and
experts. Also, encourage creativity. Often, users hesitate to
share ideas because they think the ideas are unrealistic or too
trivial. For users not familiar with the available technology,
help by prompting them with questions like "If you could
do this, would it help?"
During the visits, try to measure aspects of the system that are
relevant to the project goals, such as the frequency and duration
of particular tasks, the most common errors, system reliability,
etc. These limited measurements can be complemented with automatic
data collection that can cover a longer period of time (Shneiderman,
Brethauer, Plaisant & Potter, 1989). One possibility is to
request that counters be installed in the system. This data can
be compared with users' impressions of the system and with the
estimates collected during the visits.
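Such a counter can be as simple as a tally plus a timestamped log entry for each use of a feature. The sketch below assumes the host system can call a small hook at each screen or function; record_use and the feature names are hypothetical.

from collections import Counter
from datetime import datetime

usage_counts = Counter()

def record_use(feature, logfile="usage.log"):
    """Tally one use of a feature and append a timestamped log entry."""
    usage_counts[feature] += 1
    with open(logfile, "a") as log:
        log.write(f"{datetime.now().isoformat()}\t{feature}\n")

# The host system would call record_use() at each screen or function.
record_use("login")
record_use("add_referral")
record_use("login")
# After the study period, the tallies can be compared with users'
# impressions and with the estimates collected during the visits.
print(usage_counts.most_common())  # [('login', 2), ('add_referral', 1)]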
Example: The majority
of the data we collected during our interviews was subjective and qualitative.
We watched users scribble down notes to remember information in
between screens, use "cheat" sheets to log in to the
system, and pound the keyboards when the system crashed. We also
observed frustrated users trying to understand cryptic error messages,
numerous instances of duplicate work, and several cases where
users entered incorrect data. We recorded the number of times
the system crashed, how many terminals were available, and the
estimated number of screens used. The field visits also illustrated
the role ISYS played in the worker's daily routine.
Follow any leads that emerge from the visits.
Some of the best field visits are often the result of chance encounters.
Keeping a flexible schedule allows new leads to be followed up as soon as possible. Flexibility also allows users
to focus on what is most important to them.
Example: One of the most
dramatic interviews we had happened as a result of a chance encounter.
We were interviewing someone else when they suggested that we
talk to one of their colleagues, so we did. During the interview,
we observed the worker performing a routine task that caused the keyboard to lock up. The worker pounded on the keyboard, which crashed the system and destroyed all their work. The worker got up and left the room, disgusted with ISYS.
Record your visits.
It is very important that what is seen and heard during a field
visit is objectively recorded. There are several ways in which
a visit can be recorded. One possibility is to tape or video record
the visit, assuming permission is given. However, recording makes
some people uncomfortable. While traditional ethnographers want
to record visits in their entirety for archival purposes, designers
are more interested in summarized reports. A more suitable approach is to use pen and paper, or possibly a laptop computer. Taking pictures or videos is useful for realistically depicting aspects
of the system that are hard to capture in written text, such as
the work environment.
It is important to be as descriptive as possible when recording
the interviews because the interview notes contain the data needed
for the analysis. The contents of the notes can be verified by
showing them to the person interviewed or, preferably, to an unbiased third party.
Example: In the DJJ project,
it has proved useful to jot down notes during an interview and
then type them up later in more detail. This approach seems to
allow us to better organize our observations, and it helps reinforce
what we observed. The interview notes have also been a good medium
through which to communicate our findings to members of our team
who could not participate.
ANALYZING THE DATA
Begin analyzing the data collected as soon as possible. Starting
the analysis before the field study is completed allows refinements
to be made. The guidelines in this section are for a simple, low-cost,
rapid analysis technique. Nayak, Mrazek, and Smith (1995) describe
several other techniques including affinity diagramming, data
visualization, prioritizing problems, scenarios of use, usability
specifications, and style guides.
Compile the collected data in numerical, textual, and multimedia formats.
When identifying how to best organize the data, it is important
to consider the project goals and how they are going to be achieved.
For instance, if statistical information is going to be used to
evaluate some aspect of the current system, the numerical data
needs to be compiled. The large amount of textual data can be
made more manageable if it is organized in categories like scenarios,
anecdotes, complaints, and suggestions. Multimedia data, like
pictures, videos, and documents, that will influence the redesign
should also be compiled (Goldman-Segall, 1994).
Example: We organized
our textual data primarily by complaints and suggestions. Since
our primary goal was to improve the user interface, we were careful
to separate environmental issues from user interface issues. We
also compiled the forms that were filled out manually, as input
and output documents, since we were interested in improving data
accuracy and user productivity.
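As a minimal sketch of this kind of organization, assume each typed-up note has been tagged with one of the categories above; the notes below are hypothetical, not actual DJJ data.

from collections import defaultdict

notes = [
    ("complaint", "log in procedure takes too many steps"),
    ("suggestion", "carry the youth's name across screens"),
    ("anecdote", "worker keeps a paper log in parallel with the system"),
    ("complaint", "error messages are cryptic"),
]

# Group the notes by category to make the textual data manageable.
by_category = defaultdict(list)
for category, text in notes:
    by_category[category].append(text)

for category, items in sorted(by_category.items()):
    print(f"{category} ({len(items)}):")
    for item in items:
        print(f"  - {item}")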
Quantify data and compile statistics.
How the data is organized will suggest what needs to be quantified
and the statistics to be calculated. The number of times a particular
problem was observed or the number of people who made the same
suggestion are examples of data to quantify. Later on in the development
process, statistics, like the average time to perform a particular
task, can serve as benchmarks for measuring the effectiveness
of the redesigned system.
Example: We did not calculate
any statistics in the DJJ project. In a few cases, we did count
the number of times we observed a particular problem, like users
with expired accounts, or heard a particular complaint. In hindsight,
it might have been useful to measure the time required to perform
some standard tasks, like logging in to ISYS and adding a new record.
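A small sketch of the kind of tallying and benchmarking described above; the observations and timings are hypothetical stand-ins, not DJJ data.

from collections import Counter
from statistics import mean

# One entry per time a problem was observed across the visits.
observed_problems = [
    "expired account", "system crash", "expired account",
    "duplicate record", "expired account",
]
print(Counter(observed_problems).most_common())
# [('expired account', 3), ('system crash', 1), ('duplicate record', 1)]

# Timed trials of a standard task, in seconds. The mean becomes a
# benchmark for measuring the effectiveness of the redesigned system.
login_times = [95, 110, 87, 132]
print(f"mean time to log in: {mean(login_times):.0f} seconds")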
Reduce and interpret the data.
Because the interpretation of the data can be used to support
the redesign, care should be taken to remain objective. One way
around "observer bias" is to provide the chain of reasoning
used (Nayak et al., 1995). Having an informant is also helpful.
An informant is a person who can help explain those aspects of
the system that were not able to be directly observed and can
check any inferences that have been made (Hammersley & Atkinson, 1983).
Identify problem areas by looking for common threads in the data.
Possible solutions may be found in the user's suggestions. Exploring
how information was communicated can also suggest possible design solutions.
Discuss these interpretations with the users. Ask them whether
or not the findings make sense to them. The users can provide
valuable insight needed to interpret the data.
Example: From the data,
we noted that every single person interviewed complained that
the log in procedure was too long. Over a third of the interviewees said that they used the system solely for data entry without benefiting from it personally. Poor communication and training were another recurring theme. The data indicated that ISYS is the
first experience many of its users have had with a computer. Poor
access due primarily to a lack of functioning equipment was also
identified as a major problem.
Refine the goals and the process used.
As the field study proceeds and the initial analysis begins, the
goals of the project and the process for achieving them should
be refined. Changes might include the number of users to interview,
the questions to ask, the system aspects to focus on, and the
types of data to collect. The increased knowledge of the system
and its problems allows the interviewers to better empathize with
the user's difficulties, suggest possible solutions, and elicit
user reactions. This usually allows a more active dialogue between
the interviewer and the participants, who are more likely to express their opinions.
Example: Originally, we
thought the data was inaccurate primarily because it was not being
entered in a timely fashion. After a few visits, we realized that
the data inaccuracy problem was also due to duplicate youth records
and incorrect data entry. In subsequent visits, we paid more attention
to identifying the sources of these two problems. Our primary
contact read our visit reports right after the visits and provided
clarifications. Our early findings were discussed with users during
the later visits.
REPORTING THE FINDINGS
Once the data has been analyzed, the findings should be reported
to the organization. Reporting the findings helps keep the organization
involved in the redesign process.
Consider multiple audiences and goals.
Multiple audiences will be interested in the findings for different
reasons. Management will be interested in learning how their system
"measures up" and may use the findings as justification
for further action. Some people will simply be curious to hear
what was learned. Others may use the report to measure the design
teams' understanding of the system.
Example: When we reported
our initial findings to DJJ, our audience consisted primarily
of supervisors including the head of DJJ, the executive directors,
and one member of the software staff. Since the majority of our
audience was not very computer literate, we were careful to use
appropriate terminology in our report.
Prepare a report and present your findings.
Findings presented in an objective report allow the audience to
draw their own conclusions. Nayak et al. (1995) also suggest
presenting the findings in a positive tone to avoid finger pointing.
Illustrating specific problems using quantitative data and anecdotal
evidence is also helpful.
Presenting the findings is an opportunity to educate the audience
about their own organization. It is likely that the audience has
not heard everything that will be presented. In the case of large
complex systems that are distributed over multiple offices, it
is possible that the design team is among the few people with
a good overview of the system.
Example: We tried to stay
as objective as possible when we reported our findings. We used
anecdotal evidence and specific numbers to emphasize important
issues. Anecdotal evidence definitely had a strong impact on the
audience. Some of what we reported was already known and some
of the information was new. Even reporting information that was
already known seemed important to our audience since we were viewed
as an objective third party.
INFLUENCES ON REDESIGN
Based on our findings, we proposed 28 short term recommendations
to DJJ that required no hardware changes and minimal software
changes (Rose & Vanniamparampil, 1994). These recommendations
were chosen based on the severity of the problem and the frequency
of the complaint. DJJ employees added a few of their own recommendations
which indicated that we missed some critical issues during our
interviews. It was difficult to decide what fixes to recommend
for the current interface while the new system is being developed.
For each recommendation, we estimated the payoff and the effort required to implement it. Based on these estimates, DJJ decided
to take action on all of our recommendations.
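One simple way to act on such estimates is to rank the recommendations by payoff relative to effort. The sketch below is only an illustration of that idea; the entries and the 1-5 scales are hypothetical, not the actual DJJ recommendations.

# Rank recommendations by estimated payoff relative to estimated effort.
recommendations = [
    {"fix": "shorten the log in procedure", "payoff": 5, "effort": 2},
    {"fix": "clarify error messages", "payoff": 4, "effort": 3},
    {"fix": "purge expired accounts", "payoff": 3, "effort": 1},
]

for rec in sorted(recommendations,
                  key=lambda r: r["payoff"] / r["effort"],
                  reverse=True):
    print(f"{rec['fix']}: payoff {rec['payoff']}, effort {rec['effort']}")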
Our observations confirmed that several of DJJ's business practices
need to be changed. DJJ has created several focus groups to generate
process maps of their current business procedures. These maps
will be used to evaluate their current procedures and make necessary
changes. These changes will directly influence the new system
design. DJJ is also investigating their hardware needs as a result
of our observations.
We are currently developing three prototypes for their new system
and getting feedback from users. These prototypes are in direct
response to needs voiced during the interviews. We have shown
the prototypes to approximately 60 DJJ employees, ranging from high-level supervisors to case workers. Initial reactions have been very positive. We plan to continue working with the users to
refine these prototypes.
Using an applied ethnographic method has facilitated our design
process in several ways. It has increased our trustworthiness
and credibility since we learned about the complexities of DJJ
firsthand by visiting their workplace. The visits allowed us to
develop working relationships with several end users with whom
we continue to discuss ideas. Most importantly, the users have become increasingly active participants in the design of their new system.
CONCLUSIONS
Ethnographic methods based on principles of participatory design
have proven to be an effective tool in user interface redesign.
Our method is based on an iterative approach that begins with
good preparation. The data needed for analysis is collected during
the field visits. The data analysis is started as early as possible
to allow refinement of the project goals and the strategies for
achieving them. The results are then objectively reported to the organization.
As the number of existing systems grows, so will the number of
systems being redesigned. Good designers may also need to become
effective ethnographers. Providing practical guidelines for evaluating
systems will assist designers in their efforts to perform comprehensive
and accurate evaluations in a timely manner.
ACKNOWLEDGMENTS
We thank Ricki Goldman-Segall for sparking our interest in ethnographic
approaches and providing us with some references. We thank Chris
Cassatt, Ajit Vanniamparampil, Brett Milash, and Laura Slaughter
for their contributions to the DJJ project. We are also grateful
to the DJJ employees who participated in our study. We also thank
Kent Norman and Mick Couper for providing us with valuable feedback
on our early drafts. The preparation of this report was supported
by funding from the Maryland Department of Juvenile Justice.
REFERENCES
Bentley, R., Hughes, J., Randall, D., Rodden, T., Sawyer, P., Shapiro, D., and Sommerville, I., (1992), "Ethnographically-informed Systems Design for Air Traffic Control", Proceedings of CSCW '92 - Sharing Perspectives, ACM, Toronto, Canada, 123-129.
Carmel, E., Whitaker, R., and George, J., (1993), "PD and
Joint Application Design: A Transatlantic Comparison", Communications
of the ACM, 36, 4, ACM, New York, 40-47.
Chin, J., Diehl, V., and Norman, K., (1988), "Development of an instrument measuring user satisfaction of the human-computer interface", Proceedings of CHI '88 - Human Factors in Computing Systems, ACM, New York, 213-218.
Dumas, J., and Redish, J., (1993), A Practical Guide to Usability
Testing, Ablex Publishing Corp., Norwood, NJ.
Goldman-Segall, R., (1994), "Whose Story Is It, Anyway? An
Ethnographic Answer", IEEE Multimedia, 1, 4, 7-11.
Hammersley, M., and Atkinson, P., (1983), Ethnography: Principles in Practice, Routledge, London and New York.
Hix, D., and Hartson, H., (1993), Developing User Interfaces:
Ensuring Usability through Product & Process, John Wiley
& Sons, Inc., New York.
Howard, M., (1989), Contemporary Cultural Anthropology (Third Edition), Scott, Foresman, Glenview, IL.
Hughes, J., King, V., Rodden, T., and Anderson, H., (1995), "The
Role of Ethnography in Interactive Systems Design", Interactions,
2, 2, ACM, New York, NY, 56-65.
Hughes, J., Randall, D., and Shapiro, D., (1992), "Faltering
from Ethnography to Design", Proceedings of CSCW '92 -
Sharing Perspectives, ACM, Toronto, Canada, 115-122.
Karat, C., and Karat, J. (Editors), (1992), "Some Dialogue
on Scenarios", SIGCHI Bulletin, 24, 4, ACM, New York,
Miller, S., (1993), "From System Design to Democracy",
Communications of the ACM, 36, 4, ACM, New York, 38.
Muchinsky, P., (1993), Psychology Applied to Work: An Introduction
to Industrial and Organizational Psychology (4th Edition),
Brooks-Cole Publishing Co., Pacific Grove, CA.
Muller, M., Wildman, D., and White, E., (1993), "Taxonomy
of PD Practices: A Brief Practitioner's Guide", Communications
of the ACM, 36, 4, ACM, New York, NY, 26-27.
Nardi, B., and Miller, J., (1990), "An Ethnographic Study
of Distributed Problem Solving in Spreadsheet Development",
Proceedings of CSCW '90, ACM, Los Angeles, CA, 197-208.
Nayak, N., Mrazek, D., and Smith, D., (1995), "Analyzing
and Communicating Usability Data", SIGCHI Bulletin,
27, 1, ACM, New York, 22-30.
Nielsen, J., (1993), Usability Engineering, Academic Press, Boston, MA.
Pelto, P. and Pelto, G., (1978), "Ethnography: The Fieldwork
Enterprise", In J.J. Honigmann (Editor) Handbook of Social
and Cultural Anthropology, Rand McNally, Chicago, IL.
Rose, A. and Vanniamparampil, A., (1994), "Short Term Recommendations
for Improving the ISYS User Interface", ISYS User Interface
Project Progress Report, Human-Computer Interaction Laboratory,
University of Maryland, College Park, MD, unpublished.
Shneiderman, B., Brethauer, D., Plaisant, C., and Potter, R.,
(1989), "Evaluating Three Museum Installations of a Hypertext
System", Journal of the American Society for Information
Science Special Issue on Hypertext, 40, 3, 172-182.
Shneiderman, B., (1992), Designing the User Interface: Strategies
for Effective Human-Computer Interaction (2nd Edition), Addison-Wesley
Publishing Co., Reading, MA.
Shneiderman, B. (Editor), (1993), Sparks of Innovation in Human-Computer
Interaction , Ablex Publishing Corp., Norwood, NJ.
Slaughter, L., Norman, K., and Shneiderman, B., (1995), "Assessing
Users' Subjective Satisfaction with the Information System for
Youth Services (ISYS)", Proceedings of the Third Annual
Mid-Atlantic Human Factors Conference, Blacksburg, VA, 164-170.
Spradley, J., (1980), Participant Observation, Holt Rinehart
& Winston, New York.
Suchman, L., (1983), "Office Procedure as Practical Action:
Models of Work and System Design", ACM Transactions on
Office Information Systems, 1, 4, ACM, 320-328.
Vanniamparampil, A., Shneiderman, B., Plaisant, C., and Rose,
A., (1995), "User Interface Reengineering: A Diagnostic Approach",
CAR-TR-3459, University of Maryland, College Park, MD.
Vaske, J., and Grantham, C., (1989), Socializing the Human-Computer
Environment, Ablex Publishing Corp., Norwood, NJ.