
Icon Testing for Shipboard Information Systems
By Bruce Green - April-June 2004
Introduction

Ease of use for shipboard information systems can be diminished by a number of factors relating to the uniqueness of the hundreds of software products installed on Navy ships. As a project manager in a Software Support Activity for machinery monitoring and maintenance systems onboard Navy ships (gas turbine, air conditioning plants, etc.), my responsibilities include the full software system life cycle (design, test, acceptance, training, etc.). One common design flaw that I see is the use of toolbars and icons where linked text would be more appropriate.

During the test phase of a recently completed software development project, I was surprised to see the developer had used an unusual symbol for an icon to represent the merge function. I knew that a Navy user would not be able to connect the merge function with that icon and asked the developer to include text with the symbol. Even proficient computer users will be unable to decipher the meaning of unique, symbolic icons when faced with an unfamiliar information system interface. With this concept in mind, I tested my icon recognition theory with 20 subjects using select icons from two fielded information systems and Microsoft Excel.

Background

The Apple Macintosh computer popularized the use of icons in the mid-1980s. Initially, icons took the image form of trash cans, documents and folders to mimic the physical world of an office. There were no toolbars on the original Macintosh desktop; all functions were chosen from pull-down menus on a menu bar. Later, a toolbar was added to allow common document functions such as New, Open, Save and Print to be accomplished with a mouse click. The original toolbars were simple because the available functions of the software programs were relatively simple; there were no color, charting or integrated draw functions. The original "Save" icon depicted the only save option available on a 1984-era Macintosh: a 3.5-inch diskette. As software functionality increased, the number and size of toolbars increased with it. The familiar toolbars used today are the result of nearly 20 years of graphical user interface (GUI) computing work.

Today, the familiar Save icon has not changed even though options have expanded to include saving to hard disk drives and various removable and networked media. Microsoft Office toolbar icons have become familiar to computer users over the course of the GUI computing era. But there are hundreds of unique systems on Navy ships and each has a learning curve for a fleet user. Proficiency with information systems is hampered by the fact that Navy personnel frequently change job functions and commands. Due to the number of unique systems in use and the high turnover rate of users, it is imperative that information systems on ships be as user-friendly as possible.

A common feature of legacy computer systems is overuse of the icon toolbar. Until a user becomes an expert, it is unlikely that he or she will remember how to navigate the options of a software product through the use of icons.

Since the GUI computing era began, there have been several good studies of a user's ability to select the correct icon (Dix [1]); however, these studies all presuppose that users knew the functions they wanted to select (click) and could match the correct function to the appropriate icon. Shipboard users are often novices with the computer systems they are using and must search for the functions they wish to perform. This study investigated whether icons should be used and how they could be improved. (Readers interested in more information on earlier studies can refer to the work by I. Scott MacKenzie [2] and Robert J.K. Jacob [3].)

The Experiment

Twenty Naval Sea Systems Command Philadelphia employees were given an "Icon Usability Test" consisting of toolbar images and descriptive text from three software products. Two of the products are fielded on Navy ships and the third was Microsoft Excel 2000, which is installed on most Navy computers. Each subject answered questions regarding his or her familiarity with computers and the information systems to be evaluated. The test consisted of color printouts of portions of the three toolbars. Directly below each icon a letter designation was added (see Figure 1). Subjects were instructed both in writing and orally by a test administrator to match each icon letter to a short description of a function to which the icon would logically link, for example, "Electric Power Systems Module".

The subjects were instructed that the purpose of this evaluation was to create more user-friendly icons, and they were asked to match functions without using any external data source for help. The subjects were given as much time as they needed to complete the experiment and each worked separately. The correct responses were tabulated by icon and by subject.
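To illustrate the tabulation step, a short script could score the response sheets by icon and by subject. This is a minimal sketch only; the data structures and the scoring helper below are hypothetical and are not the actual instrument used in the study.

    # Sketch: score icon-matching responses by icon and by subject.
    # `responses` maps each subject to his or her answers, and
    # `answer_key` maps each icon letter to its correct descriptor.
    def tabulate(responses, answer_key):
        by_icon = {icon: 0 for icon in answer_key}  # correct count per icon
        by_subject = {}                             # accuracy per subject
        for subject, answers in responses.items():
            correct = sum(1 for icon, choice in answers.items()
                          if choice == answer_key[icon])
            by_subject[subject] = correct / len(answer_key)
            for icon, choice in answers.items():
                if choice == answer_key[icon]:
                    by_icon[icon] += 1
        overall = sum(by_icon.values()) / (len(answer_key) * len(responses))
        return by_icon, by_subject, overall

With 22 icons and 20 subjects, the denominator works out to the 440 decoding instances reported below.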

General Results

Each of the subjects had to correlate 22 separate icons to descriptions for the two Navy systems tested, for a total of 440 instances of icon decoding. Overall, subjects were successful only 54 percent of the time when trying to match Navy system icons to descriptors. All subjects reported familiarity with computers, and 16 of the 20 reported familiarity with shipboard equipment. There was no correlation between a subject's score and his or her knowledge of shipboard equipment. There was, however, a correlation between a subject's familiarity with the information system being evaluated and his or her accuracy.

None of the 20 subjects reported having previously used the first information system evaluated, referred to as Sys#1. Six of 20 subjects reported prior use of the second system, Sys#2, and 18 of 20 reported experience with Microsoft Excel. The icons for the two fielded Navy systems were correctly matched to descriptors approximately half the time. Microsoft Excel icons were correctly matched 100 percent of the time by 17 of 20 subjects, consistent with the 18 of 20 who reported having used Excel. Results are shown in Table 1.

Results by Icon Type

Users were able to match icons with descriptors in 100 percent of the responses when the icon contained text that explicitly linked it to the function (see Table 2). The icons that contained text included one with the letters "PMT," which linked to "PMT Query," and the "8 o'clock Reports" icon shown below in Figure 2. These results may seem obvious, yet many icons on shipboard systems are devoid of helpful text. Users were able to match icons that incorporated universal symbols, such as a globe for "Global Log Review" and a lightning bolt for electric power source (see Figure 3), with 95 percent accuracy.

In one case, an icon contained text, but the text did not relate to the name of the function, and users were able to link the icon to the descriptive statement with only 65 percent accuracy. In this particular case the icon linked to a software product named "DynaText" and contained the letters "CE" under a magnifying glass. It is not surprising that many users were unable to make the leap from "CE" to "DynaText."

Conclusions

Analysis of this limited study reveals that if users can match icons to the correct function only about half the time, they will quickly become frustrated as they search for software links or mistakenly open the wrong modules. This frustration is heightened when the user is busy and trying to complete complicated tasks.

The results of the experiment show that icon symbols have limitations, but an icon that contains explanatory text increases the chances of a user picking the correct software function. Software designers and developers should be aware that users need to be able to determine easily which button or icon will lead them to the function they wish to perform. Designers should always use natural-language text or commonly understood symbols rather than creating unique symbol-based icons.
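To make the recommendation concrete, the sketch below shows one way a toolbar button can pair its symbol with a plain-text label. It uses Python's Tkinter purely as an example; any GUI toolkit offers an equivalent, and the icon file name is a placeholder.

    import tkinter as tk

    root = tk.Tk()
    toolbar = tk.Frame(root, relief=tk.RAISED, borderwidth=1)
    toolbar.pack(side=tk.TOP, fill=tk.X)

    # "merge.png" is a placeholder for whatever symbol the designer chooses.
    merge_icon = tk.PhotoImage(file="merge.png")

    # compound=tk.LEFT draws the image beside the text, so the button
    # still reads "Merge" even when the symbol itself is unfamiliar.
    merge_button = tk.Button(toolbar, image=merge_icon, text="Merge",
                             compound=tk.LEFT)
    merge_button.pack(side=tk.LEFT, padx=2, pady=2)

    root.mainloop()

The design point is the text label: it, not the symbol, carries the meaning for a first-time user.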

References

1. Dix, Alan J. and Brewster, Stephen A. "Causing Trouble With Buttons." Ancillary Proceedings of HCI'94. Ed. D. England. Glasgow, Scotland, 1994. Available at http://www.comp.lancs.ac.uk/computing/users/dixa/papers/buttons94
2. MacKenzie, I. Scott. "Movement Time Prediction in Human-Computer Interfaces." Readings in Human-Computer Interaction. (2nd ed.) Eds. R.M. Baecker, W.A.S. Buxton, J. Grudin, & S. Greenberg. Los Altos, CA: Kaufmann, 1995. (pp. 483-493)
3. Jacob, Robert J.K. "Eye-gaze Computer Interfaces: What You Look At is What You Get." IEEE Computer, July 1993, Vol. 26, No. 7, (pp. 65-67). Available at http://www.cs.tufts.edu/~jacob/papers/hot.pdf

Bruce Green is a Technical Specialist in the Naval Surface Warfare Center Carderock Division (NSWCCD) Ship Systems Engineering Station in Philadelphia.

Figure 1. A row of different icons, including handicapped signs and stop signs.

Figure 2. An icon with a clock and an 8, representing 8 o'clock reports.

Figure 3. An icon with a lightning bolt.

Table 1. Icon-matching results by system.

Table 2. Icon-matching results by icon type.