*Jason Ellis is now a Ph.D. student in the GVU Center at Georgia Tech. He can be reached at email@example.com.
The Human-Computer Interaction Laboratory (HCIL) and the Maryland Department of Juvenile Justice (DJJ) have been working together to design the ProgramFinder, a tool for selecting programs for troubled youths, ranging from drug rehabilitation centers to secure residential facilities. The seemingly straightforward journey of the ProgramFinder from an existing user interface technique to a product design required the development of five different prototypes, involving user interface design, prototype implementation, and search attribute selection. While HCIL's effort focused primarily on design and implementation, selecting the search attributes proved to be DJJ's most time-consuming and difficult task. We also found that a direct link to DJJ's workflow was needed in the prototypes to generate the necessary "buy-in." This paper analyzes the interaction between the efforts of HCIL and DJJ and the amount of "buy-in" by DJJ staff and management, and presents lessons learned for designers.
Technology transfer, visualization, dynamic query, legal systems, matching
For the past two years, the Human-Computer Interaction Laboratory (HCIL) has been working with the Maryland Department of Juvenile Justice (DJJ) to redesign the user interface of their information system, which is used to process approximately 50,000 juvenile complaints per year. The first year consisted of performing 22 field visits, administering the Questionnaire for User Interaction Satisfaction (QUIS) to 332 DJJ personnel, and making short- and long-term user interface recommendations. In the second year, we continued with extensive prototyping with an emphasis on supporting DJJ's workflow with respect to youth case management. Our role was to propose novel user interface designs for DJJ's consideration. The actual system will be implemented by another party.
One case management function that we identified as being a candidate for the new system involves selecting the best program for a youth. These programs range from community-based drug treatment programs to secure residential facilities. Currently, DJJ searches through a 4-inch manual of about 250 programs to find the best one. Not only is this very time consuming but there is also the potential bias of choosing the first program found, as opposed to the one that best suits the youth.
We believed that HCIL's earlier dynamic query (DQ) research could be applied to this problem. Most of DJJ's employees are novice computer users. Several have never worked in a graphical windows environment or even used a mouse, and only a few are familiar with query languages like the Structured Query Language (SQL). DQ applications allow users to make queries very quickly and easily by adjusting sliders and selecting buttons while the search results are continuously updated in a visual display (e.g., an x/y scatterplot or a map). DQ applications are particularly good for novice users since they do not require a special query language to be learned, invalid queries cannot be formed, and users see the results of adjusting a value immediately. The ProgramFinder was designed to allow DJJ to quickly and easily select the best program(s) for a youth from among all the programs matching user specified attributes.
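The core of the dynamic query idea is that every slider adjustment re-runs a simple range filter over the dataset and the visual display updates with the matches. The following is a minimal sketch of that filtering logic; the attribute names and values are invented for illustration and do not come from the actual ProgramFinder dataset.

```python
# Hypothetical sketch of dynamic-query filtering: each slider adjustment
# re-runs the filter and the display redraws with the matching items.
# Attribute names and values are invented for illustration.

programs = [
    {"name": "Program A", "capacity": 20, "distance_miles": 5},
    {"name": "Program B", "capacity": 60, "distance_miles": 40},
    {"name": "Program C", "capacity": 35, "distance_miles": 12},
]

def dynamic_query(items, ranges):
    """Return items whose attributes all fall within the slider ranges."""
    return [
        item for item in items
        if all(lo <= item[attr] <= hi for attr, (lo, hi) in ranges.items())
    ]

# Simulated slider settings: capacity between 10 and 40,
# distance no more than 15 miles.
matches = dynamic_query(programs, {"capacity": (10, 40),
                                   "distance_miles": (0, 15)})
print([m["name"] for m in matches])  # Programs A and C match
```

Because the filter is cheap to re-evaluate, it can run on every slider movement, which is what gives DQ interfaces their immediate, continuous feedback.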
One of the original dynamic query prototypes was the HomeFinder, a tool for browsing homes for sale in an area (Figure 1). Instead of the HomeFinder's map of Washington, D.C., the ProgramFinder plots the available programs on a map of Maryland (Figure 2). Adjusting the controls updates the display, which shows a dot for each program that matches. A click on a program provides more details, and the press of a button generates the paperwork.
This paper describes the seemingly straightforward conversion of the ProgramFinder from a research prototype to a real product, and analyzes the interaction between the efforts of HCIL and DJJ and the amount of "buy-in" of DJJ staff and management (i.e., how excited they seemed to be about the prototype). We found that many levels of prototyping were still needed (five in all) and that the choice of the search attributes was the most time consuming (and the most conflict generating) task. A direct link to the workflow was also needed in the prototypes to generate the necessary "buy-in."
Figure 1. Original HomeFinder research prototype
Figure 2. Final ProgramFinder prototype
The process of evolving the ProgramFinder design from the original HomeFinder concept (Figure 1) to the final design (Figure 2) involved five different prototypes: the initial IVEE prototype, the Initial Customization prototype, the Comparison prototype, the Testing prototype, and the final prototype design.
The primary effort involved in developing each prototype consisted of customizing the user interface, implementing the prototype, and deciding on the search attributes. The level of effort in each of these categories and the amount of user "buy-in" varied by prototype.
The initial IVEE prototype was developed in a few hours to illustrate the ProgramFinder concept to DJJ. Once we had DJJ's go ahead, the Initial Customization prototype was developed. This is when development started to focus more intently on DJJ's workflow and as a result DJJ's "buy-in" increased dramatically. DJJ also began working harder on choosing the selection attributes.
Over time, it became obvious that workers had vastly different opinions from management. A comparison prototype was developed to illustrate the workers' ideas to management. After considerable debate, management decided on a set of attributes to use and a Testing prototype was developed so preliminary usability testing could be performed.
Once these attributes were chosen, the design effort gained steam as DJJ began reacting to the details in the prototype and requesting many modifications. The Testing prototype required increased implementation effort because there was little working functionality in the previous prototypes.
The first prototype (Figure 3) was built in a few hours using the Information Visualization and Exploration Environment (IVEE). IVEE automatically creates DQ interfaces for specified datasets. The dataset used to generate the ProgramFinder was entirely mocked up by HCIL.
The major drawback of using IVEE was that it ran on Sun workstations and DJJ only uses PCs. For demonstration purposes, we resorted to using a slide show of IVEE screens in conjunction with a live demo of the HomeFinder to show the smooth DQ interaction. DJJ's initial reactions were positive and they asked us to continue.
The implementation and attribute selection efforts increased during this phase. DJJ started to get more involved as well. They provided us with detailed information about their placement process and proposed a set of attributes. The attributes allowed them to specify the "best" value (i.e., the ideal value for the youth) within a range of values. This allowed the selected programs to be rank ordered but required a few modifications to the range slider.
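One straightforward way to realize this rank ordering is to score each in-range program by its distance from the "best" value on every attribute. The sketch below illustrates that idea; the attribute names, scales, and scoring rule are assumptions for illustration, not the actual ProgramFinder algorithm.

```python
# Hypothetical sketch of rank ordering by a "best" value: each slider
# holds a (low, high, best) setting; programs within all ranges are
# scored by total distance from the best values (lower = better match).

programs = [
    {"name": "Program A", "restrictiveness": 2, "intervention": 4},
    {"name": "Program B", "restrictiveness": 3, "intervention": 3},
    {"name": "Program C", "restrictiveness": 5, "intervention": 2},
]

sliders = {
    "restrictiveness": (1, 4, 3),  # (low, high, best)
    "intervention":    (2, 5, 3),
}

def score(program):
    """Sum of distances from each attribute's 'best' value."""
    return sum(abs(program[attr] - best)
               for attr, (lo, hi, best) in sliders.items())

# Keep only programs inside every range, then order by score.
in_range = [p for p in programs
            if all(lo <= p[attr] <= hi
                   for attr, (lo, hi, _) in sliders.items())]
ranked = sorted(in_range, key=score)
print([p["name"] for p in ranked])  # Program B is the closest match
```

A scheme like this is what allows the matching programs to be color coded by rank, as discussed later in the paper.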
This prototype was developed using Borland's Delphi on PCs (Figure 4) which allowed us to use DJJ's machines and to customize the interface, neither of which were possible with IVEE. A "Send Packet..." button was added to demonstrate how users could select a program and then automatically generate the required paperwork.
DJJ's "buy-in" jumped dramatically when they were shown this prototype. They were very excited to see a DJJ document pop up when the "Send Packet..." button was pressed. They immediately started discussing how they could market ProgramFinder to other juvenile justice agencies. They also started envisioning other ways the ProgramFinder could be used. For example, they suggested adding a referral log so the acceptance and rejection patterns of different programs could be monitored. They also envisioned using the ProgramFinder to coordinate with other Maryland agencies which would allow them to choose from over 2000 programs (compared to only 250 currently).
At this stage, DJJ's involvement moved from a casual exploratory effort to more serious product design work. We believe this was due primarily to the customization effort, even though it was relatively simple. Ironically, the ProgramFinder was not a tool DJJ anticipated needing initially.
Shifting toward a more serious design effort, we started working more closely with the DJJ staff and discussing their needs in detail. Up to this point, our discussions had been primarily with middle and upper management. While the users were pleased with the general concept of the ProgramFinder, they were concerned that the proposed attributes did not correspond to how they currently did their jobs. The comparison prototype was developed to illustrate their proposed changes to management (Figure 5).
The major effort in creating this prototype was defining the new attributes, which were significantly different from the attributes proposed by management. Instead of choosing a range of values, workers wanted to rank each value on a scale from "not important" to "required." Attributes marked "not important" would be ignored, while low-importance and high-importance attributes would be used to color code programs. An "attribute ranker" widget was designed to facilitate these selections. Some other minor changes included reducing the number of program types in the legend, adding more fields to the details area, and creating a "Show Referral Log" button.
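The workers' scheme can be summarized as: "required" values filter programs out, "not important" values are ignored, and the importance levels in between drive color coding. A minimal sketch of that classification logic follows; the importance levels, service names, and color labels are invented for illustration.

```python
# Hypothetical sketch of the workers' "attribute ranker" scheme:
# required values act as filters, not-important values are ignored,
# and low/high importance values only affect color coding.

IGNORE, LOW, HIGH, REQUIRED = 0, 1, 2, 3

programs = [
    {"name": "Program A", "services": {"drug_treatment", "counseling"}},
    {"name": "Program B", "services": {"counseling"}},
]

rankings = {"drug_treatment": REQUIRED,
            "counseling": HIGH,
            "education": IGNORE}

def classify(program):
    """Filter out programs missing a required service; color the rest."""
    offered = program["services"]
    if any(rank == REQUIRED and svc not in offered
           for svc, rank in rankings.items()):
        return None  # filtered out of the display
    matched = [rankings[s] for s in offered if s in rankings]
    return "bright" if HIGH in matched else "dim"

for p in programs:
    print(p["name"], classify(p))
```

The contrast with management's scheme is visible in the code: here most attributes act as binary checks or display hints, whereas management's range-plus-"best"-value attributes force the user to weigh every attribute when querying.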
At this point, there were two significantly different prototypes that needed to be brought to some consensus. Essentially, the workers wanted a system that supported how they currently did their jobs while management was interested in redefining the program selection process. Programs are currently selected according to a "type of care" paradigm where each program provides a description of the services they provide. Management is interested in establishing a new method where programs are described by a continuum of services measured by levels of restrictiveness (custody) and intervention (treatment). This would allow DJJ to directly link a youth's risk assessment (restrictiveness required) and needs assessment (intervention needed), which are measured by the same type of levels, to the selection process.
Management was presented with both prototypes and the strengths and weaknesses of each were discussed. After a month of deliberation, management chose the Initial Customization prototype. Management's rationale was that the attributes in the Comparison prototype did not engage the users in the selection process as much as those in the Initial Customization prototype. They were concerned that users might ignore critical areas in the lengthy checklists, which would greatly affect the level of service a youth receives. They decided it was preferable to provide a few attributes with broad implications and ask users to consider all of them. They felt the new method would ultimately make things easier for workers and better programs would be chosen.
The decision not to use the workers' attributes (Comparison prototype) decreased their "buy-in" temporarily. While the ProgramFinder would still help them do their jobs, it had now become the vehicle by which their jobs were being redefined. Resistance to change is often encountered when new systems are introduced.
The decision to use the Initial Customization prototype spawned additional discussions about the set of attributes to be included in the Testing prototype, which was used to perform initial usability testing. Interestingly, after working with management's proposed attributes for a few hours, workers began to understand and appreciate management's new method. However, they did propose a few new attributes to include in the Testing prototype.
Management requested that the color coding not be included in the Testing prototype because they felt it would unduly bias the selection process. The concern was that workers might just select the highest ranked program (e.g., the one with the "best" color) and not take into account other suitable programs. DJJ wanted to avoid creating a tool that gives the "perfect answer." They wanted the ProgramFinder to narrow down the number of programs and then require the workers to examine each of the remaining programs in-depth.
HCIL's major effort in developing the Testing prototype (Figure 6) was implementation since there was still very little working functionality in the previous prototypes. Slider implementation required the most time. Similar controls are available in the public domain but none had all the functionality DJJ needed.
During testing, several changes were proposed to the prototype. The result was the final prototype design (Figure 2).
Preliminary usability testing was conducted on the Testing prototype. The goal was to give users hands-on experience while HCIL gained valuable design feedback. Testing consisted of two sessions with a total of seven users, a limited but representative group. Each session was divided into four sections: training, testing with representative tasks, filling out a questionnaire, and discussion. Screen mockups illustrating solutions to problems discovered during the first session were presented to users in the second session for their feedback.
Users' reactions were positive overall. They felt that the system would very likely help them select better programs for youths. They also thought the correct amount of information was being displayed which was not surprising since they had been involved in the design from the early stages.
Users did complain that the characters on the screen were too small and that the display was somewhat confusing. The confusion was likely due to their lack of experience with graphical user interfaces and the fact that the selection attributes were new to them.
Additional usability issues emerged during testing, and were addressed by the final design (Figure 2):
1- Addition of Textual Display - Users noted that often the location of a program is not taken into account when placing a youth so we added a textual display (showing a list of programs and their details) as an alternative way to review the best matches. The textual display is better for displaying more details at one time but the map display can provide an overview of all the matches in one screen (without using a scroll bar).
2- Reinstate the "Best" Values - DJJ reversed its decision about color coding with respect to "best" values. Although they were initially concerned that ranking programs might bias the selection process, after using the system they realized the color coding could assist workers when there is no program that matches a youth's needs fully (which is often the case). Assigning "best" values would also provide a clearer picture of what sorts of programs are needed.
3- More Integrated Help - Users found it difficult to remember what the number on the range sliders meant. Each number actually corresponds to a lengthy description that cannot be summarized in a few words. Users in the first session recommended allowing users to make selections from the help facility which contains the descriptions. A sample screen illustrating how this might be done was presented to the users in the second session (Figure 7).
Figure 7. Help facility supporting range selection

The new interface met with mild approval, but users felt the range sliders would be more convenient once they learned how they work.
4- Attaching Notes - While using the ProgramFinder, workers found that they wanted to record comments about their settings. A small icon was added above each slider that, when clicked, displays the portion of the placement paperwork related to that particular attribute.
5- Modifying Range Sliders - Several users expressed difficulty using the range sliders. They were especially frustrated using the sliders when they knew the exact range they wanted. One suggestion was to enhance the range sliders to allow users to select a range by dragging the mouse across the values shown below the slider. This would require only one action, as opposed to the two drags required by the standard range slider.
6- Reordering Sliders - The order and categorization of attributes was raised as an important issue. The decision was made to present the controls by workflow and allow users to redisplay them alphabetically if they choose.
After several months of effort and five different prototype designs, we learned several important (and sometimes surprising) lessons that could benefit future developers.
Search attribute selection can be difficult - We initially anticipated that it would be a simple task, but choosing the search attributes required the highest level of effort and caused the most conflict within DJJ. The Comparison prototype was developed solely for the purpose of exploring alternative attributes.
Customization increases "buy-in" - We were surprised how much DJJ's "buy-in" increased after the Initial Customization prototype was developed. To us, it was merely a re-implementation of the IVEE prototype for PCs, and the customization added was very minor (a few buttons and scanned forms). Yet it had a dramatic impact on DJJ's ability to understand how the ProgramFinder could help them, and it got them to start planning for novel uses.
Interface design can initiate changes in work processes - In the case of the ProgramFinder, the selected set of attributes will significantly change how DJJ selects programs. This temporarily troubled workers but they soon came to understand how it could help them choose better programs for the youths.
Presentation of similar applications stimulates early interest - Even though it is less effective than building a customized prototype, showing "live" demos of similar systems (e.g., HomeFinder) helps focus user thinking and bootstrap management "buy-in."
Creating alternative designs helps engage users - Illustrating functional differences through the creation of several prototypes is a very powerful tool. Users who initially expressed no opinions came forward with strong ideas once concrete choices were presented.
Creating a dialog between users and management early can save time - Meeting with workers and management together earlier in the design process might have eliminated the need for the Comparison prototype. Joint meetings can also help alleviate the "us against them" syndrome.
The level of effort to convert an existing interface technique into a custom design is significant. The entire process of designing the ProgramFinder involved six months of effort and five different prototypes (Figure 8) and there are still issues to resolve.
Figure 8. ProgramFinder Design Process. Line thickness indicates the relative amount of effort and "buy-in" and the length approximates the amount of time involved.
Selecting the search attributes was the most time consuming and conflict generating task. Demonstrating similar applications early on and adding custom workflow hooks to the prototypes increased "buy-in." Alternative designs were presented to increase user involvement. This effort also served as the catalyst for DJJ to redesign their work practice.
We would like to thank Walt Wirshing and Dave Brimm from DJJ for their overall assistance. Additional thanks are due to all the DJJ personnel who took time out of their busy schedules to work with us. The preparation of this report was supported by funding from the Maryland Department of Juvenile Justice.
1. Ahlberg, C., Shneiderman, B. (1994) "Visual Information Seeking: Tight coupling of dynamic query filters with starfield displays," ACM CHI '94 Conference Proc. (Boston, MA, April 24-28, 1994), 313-317. Also appears in Readings in Human-Computer Interaction: Toward the Year 2000, Baecker, R.M., Grudin, J., Buxton, W.A.S. & Greenberg, S., Eds., Morgan Kaufmann Pubs., Inc., (1995), 450-456.
2. Ahlberg, C., Wistrand, E. (1995) "IVEE: An Information Visualization & Exploration Environment," Proceedings of IEEE Visualization '95 (Atlanta, October 1995), 66-73.
3. Rose, A., Shneiderman, B., Plaisant, C. (1995) "An applied ethnographic method for redesigning user interfaces," ACM Proc. of DIS '95, Symposium on Designing Interactive Systems: Processes, Practices, Methods & Techniques (Ann Arbor, MI, Aug 23-25, 1995), 115-122.
4. Slaughter, L., Norman, K., Shneiderman, B. (1995) "Assessing Users' Subjective Satisfaction with the Information System for Youth Services (ISYS)," Proceedings of Third Annual Mid-Atlantic Human Factors Conference (Blacksburg, VA, March 26-28, 1995), 164-170.
5. Williamson, C., Shneiderman, B. (1992) "The dynamic HomeFinder: Evaluating dynamic queries in a real-estate information exploration system," Proceedings ACM SIGIR '92 (Copenhagen, June 21-24, 1992), 338-346. Also appears in Sparks of Innovation in Human-Computer Interaction, Shneiderman, B., Ed., Ablex (June 1993), 295-307.