
Human Factors Evaluation of Electronic Flight Bags

Divya C. Chandra
United States Department of Transportation
Volpe National Transportation Systems Center
Operator Performance and Safety Analysis Division
55 Broadway, Cambridge, MA 02142, USA
+1 617-494-3882
divya.chandra@dot.gov

Abstract

Electronic Flight Bags (EFBs) are small, customizable information-management devices that aid pilots and aircraft operators in conducting flights more efficiently and safely. While the promise of EFBs is great, government regulators, potential customers, and industry developers all agree that EFBs raise many human factors considerations that must be handled appropriately to realize this promise without adverse effects. To support the development of an Advisory Circular on EFBs, the Federal Aviation Administration (FAA) tasked the Volpe Center to identify EFB human factors considerations. These were documented and reviewed by both government and industry. The next step is to assist the FAA in creating an evaluation procedure for EFBs that is based on the human factors document. The procedure will be designed for use by inspectors to evaluate EFB human factors considerations in the field.

Introduction

Electronic Flight Bags (EFBs) are small, customizable information-management devices that aid pilots and aircraft operators in conducting flights more efficiently and safely. Even a basic laptop computer that has flight-management functions qualifies as an EFB by this definition. Today there are laptop-style EFBs in use during revenue operations at many airlines both in the United States and in Europe (e.g., Southwest Airlines, JetBlue Airways, FedEx, Finnair, and Lufthansa). These EFBs are often used to calculate flight performance and/or view airline documents such as the Pilot's Operating Handbook and Flight Operations Manual. In the near future, EFBs may host a range of other functions, such as electronic approach plates, electronic checklists, surface moving maps, and even cabin video surveillance.

Development of EFBs has accelerated rapidly in the past few years (Carey and Michaels, 2002; Hanson, 2002; Jensen, 2002; Shamo, 2000; Trotter-Cox, 2000). The business case for deploying EFBs considers many types of benefits. In general, however, EFBs are attractive because, relative to traditional avionics, they have a low initial cost, can be customized, and are easily upgraded. EFB benefits include reduced costs for data management and distribution, potentially reduced training costs, and even the avoidance of medical costs associated with pilot injuries from carrying heavy flight bags filled with paper. Some airlines are even working directly with vendors to design EFB solutions for their specific needs.

While the promise of EFBs is great, government regulators, potential customers, and industry developers all agree that EFBs raise many human factors considerations that must be handled appropriately to realize this promise without adverse effects. The Federal Aviation Administration (FAA) recently completed an Advisory Circular (AC) for EFBs that addresses some of these issues (FAA, 2002). FAA Aircraft Certification and Flight Standards collaborated on the AC, so it covers both issues related to the installation of an EFB system in the aircraft and issues related to its use by the flight crew.

To support development of the EFB AC, the FAA asked the Volpe Center more than two years ago to identify EFB human factors considerations. The result of that work is a comprehensive document that covers general system considerations and considerations for three specific EFB applications: electronic documents, electronic checklists, and flight performance calculations (Chandra and Mangold, 2000a). This document was subsequently reviewed and updated significantly. The next step is to assist the FAA in creating an evaluation procedure for EFBs that is based on this document. The procedure will be designed for use by inspectors to evaluate EFB human factors considerations in the field. In this paper, I review the progress and plans made to date on these two efforts.


Considerations Document

Human factors considerations for the design and evaluation of EFBs are documented in a report released in September 2000 (Chandra and Mangold, 2000a). The report was reviewed extensively by industry through the Air Transport Association's Digital Data Working Group, so it is expected to be useful to industry designers and customers of EFBs, as well as to FAA inspectors. It contains four substantive chapters: a general chapter on system considerations, and individual chapters on electronic documents, electronic checklists, and flight performance calculations.

The document on EFB human factors considerations is lengthy, but this material needs to be useful to busy people who are not human factors experts and who do not have time to become human factors experts. Therefore, one of the most important features of this document is its structure, which was designed for maximum ease of use. The structure of the document is covered briefly below and discussed in more detail in Chandra and Mangold (2000b), which includes a sample consideration. Recent content updates and plans for the document are also presented.

Structure
The overall structure of the document is visible in the Table of Contents. The first substantive chapter contains system considerations that could apply to any EFB, regardless of the applications that are supported. Each of the remaining chapters covers a specific application. For example, if a given EFB supports electronic documents, then guidance from that chapter should be reviewed; otherwise, that chapter can be skipped.

Each chapter contains several "considerations." Each consideration is covered in approximately one page. There are brief guidance statements at the top of the page, and more detailed supporting material at the bottom. References for further information are provided in the supporting explanatory material.

Each guidance statement within a consideration is categorized in two ways. First, it is categorized as being related to Equipment (i.e., the specific hardware or software), Installation (i.e., how the unit functions in the context of an aircraft), Training/Procedures (i.e., how the crew uses the equipment), or some combination of these designations. These categories narrow down the audience that would be interested in that statement.

The second dimension along which each guidance statement is categorized is the type of information it contains. Some guidance statements should be considered as basic functional requirements, while others are recommendations, suggestions, or issues with complex tradeoffs.[1] The first three categories (i.e., requirement, recommendation, or suggestion) prioritize the guidance statements loosely. High-level requirements point out, in just a few sentences, the most critical risk areas. Compliance with recommendations is highly desirable, whereas suggestions should be considered but may not be suitable for every situation. Issues differ from the other types of guidance in that they are descriptive statements, not prescriptive. That is, they point out design tradeoffs, but do not specify a "correct" or "best" solution.

By categorizing the guidance statements along the two dimensions noted above (intended audience and type of information), the structure of the document allows the reader to identify relevant guidance fairly quickly. In this way, the document is designed to be browsed rather than read from cover to cover, so it will be more accessible and useful to a larger audience.
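To make this two-way tagging concrete, the sketch below models a guidance statement in Python. The class names, tags, and example statement text are illustrative assumptions, not excerpts from the Volpe document.

```python
from dataclasses import dataclass
from enum import Enum, Flag, auto

class Audience(Flag):
    """First dimension: who the statement concerns (combinations allowed)."""
    EQUIPMENT = auto()            # the specific hardware or software
    INSTALLATION = auto()         # how the unit functions in the aircraft
    TRAINING_PROCEDURES = auto()  # how the crew uses the equipment

class GuidanceType(Enum):
    """Second dimension: the type of information the statement contains."""
    REQUIREMENT = auto()     # basic functional requirement
    RECOMMENDATION = auto()  # compliance highly desirable
    SUGGESTION = auto()      # worth considering, not suitable everywhere
    ISSUE = auto()           # descriptive: a design tradeoff, no single "best" answer

@dataclass
class GuidanceStatement:
    text: str
    audience: Audience
    kind: GuidanceType

# A hypothetical statement tagged along both dimensions.
stmt = GuidanceStatement(
    text="EFB audio alerts should not mask other flight deck alerts.",
    audience=Audience.EQUIPMENT | Audience.INSTALLATION,
    kind=GuidanceType.RECOMMENDATION,
)
```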

Content Updates
Comments on Chandra and Mangold (2000a), known as "Version 1," were received from industry as well as government reviewers. These comments were incorporated into an updated document known as "Version 2," which contains more detail, more caution, and more clarification of whom the guidance applies to (e.g., commercial operations versus private operations). The increased level of detail in Version 2 is apparent in the additional examples, definitions, and references. Version 2 also urges more caution, particularly with regard to alerts, audio, and workload and distraction issues.

Version 2 also adds a new chapter on electronic charts. This chapter largely contains recommendations and issue statements rather than requirements, because both research on and operational experience with electronic charts are relatively limited at this time.

Plans
The Version 2 EFB human factors considerations document is to be released in fall 2002. The guidance in the Version 2 document will be reconciled with statements from the recently published EFB AC. Open issues will also be resolved or closed as much as possible.

Finally, the plan is to issue not only a paper report but also a more interactive electronic version of the document. The electronic document will be distributed via the Web but, once downloaded, will run on laptops without requiring real-time Web access. The electronic version will support search functions and allow the user to view subsets of the document customized for each of the three audiences (equipment, installation, training/procedures). The goal is to expand the accessibility and utility of the document for all potential users, both designers and evaluators.
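Building on the hypothetical GuidanceStatement sketch above, the customized views and search function might be implemented as simple filters over the tagged statements. This is an assumption about how such an electronic document could work, not a description of its actual implementation.

```python
def view_for(statements, audience):
    """Return the subset of guidance tagged for one audience; a statement
    tagged for multiple audiences appears in each of those views."""
    return [s for s in statements if s.audience & audience]

def search(statements, term):
    """Case-insensitive full-text search over guidance statements."""
    term = term.lower()
    return [s for s in statements if term in s.text.lower()]

# Example: the customized "Installation" view, then a keyword search.
installation_view = view_for([stmt], Audience.INSTALLATION)
hits = search([stmt], "alert")
```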


Evaluation Procedure

The EFB human factors considerations document has been well received in terms of both format and content. The next programmatic goal is to assist the FAA by creating an operational procedure for evaluating EFBs. This is a practical issue: how should the guidance in the considerations document be incorporated into the EFB evaluation process?

In the next section, I discuss the different types of evaluations that could be conducted on EFBs, with special emphasis on describing the typical regulatory evaluation. Following this, I review specific plans for developing an evaluation procedure for EFB human factors considerations.

Types of Evaluations
There are two different types of evaluations that could be conducted on a new EFB. For this discussion, I will call the first type a comparative evaluation and the second type a regulatory evaluation.

In a comparative evaluation, use of the EFB is compared against an older system or against a different way of completing the same task, in order to determine which method is better. Often, the expectation is that the new system will provide some relative advantage to the user. For example, the experimenter could ask a user to search for a specific piece of information with an EFB that supports electronic documents, or with paper documents, to determine whether the user is quicker, more accurate, or perhaps more thorough with the electronic document. For examples of comparative EFB evaluations, see Shamo, Dror, and Degani (1998, 1999).

Comparative evaluations can help to predict whether the new system will provide some operational benefit overall, perhaps in terms of safety or efficiency. They often focus on a few key areas where performance is expected to improve with the new system. Their results may also suggest design modifications that optimize usability of the new system. Results of comparative evaluations may eventually influence a potential customer's decision on whether to invest in a new system or not.
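As an illustration of the kind of paired data such a study yields, the sketch below summarizes hypothetical per-pilot search times for paper versus EFB documents. The numbers and the summary function are invented for illustration only.

```python
from statistics import mean, stdev

def summarize_comparison(paper_times, efb_times):
    """Summarize paired per-pilot task times in seconds.
    A positive mean difference means the EFB was faster."""
    diffs = [p - e for p, e in zip(paper_times, efb_times)]
    return {
        "mean_paper_s": mean(paper_times),
        "mean_efb_s": mean(efb_times),
        "mean_time_saved_s": mean(diffs),
        "sd_time_saved_s": stdev(diffs),
    }

# Hypothetical search times for five pilots on the same retrieval task.
print(summarize_comparison(
    paper_times=[95, 120, 80, 140, 110],
    efb_times=[60, 75, 70, 90, 85],
))
```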

In a regulatory evaluation, however, the question is not whether the new system (in this case, an EFB) is any better, but rather whether the system meets its intended function. More specifically, one might expand this question to "Does the system meet its intended function, without introducing any undue difficulty or additional risk?" In other words, does it work, and does it work safely? The system needs to meet minimum operational performance requirements, and the purpose of the regulatory evaluation is to focus on these functional requirements without specifying design solutions. The goal is instead to identify any weaknesses in the new system that create an unacceptable level of risk. Consequently, a regulatory evaluation must be broad and comprehensive. Further, any weaknesses that are found must be clearly identified to the applicant for approval.

One example of a regulatory evaluation procedure is given in Huntley, Turner, Donovan, and Madigan (1995), a human factors evaluation guide for approval of standalone Global Positioning System (GPS) receivers. The guide provides a comprehensive set of test procedures for the evaluator to perform, and a place for the evaluator to record his/her observations about the process, including an overall assessment. Requirements and guidelines associated with each test procedure are referenced at the bottom of the page, with their full text available in the appendices. The evaluator can make subjective observations about the usability of the device for completing a procedure, and then use the reference materials to correct, confirm, or reinforce those observations.

There are many advantages to using the type of evaluation procedure given in Huntley et al. First, the evaluator need not be a human factors expert to conduct the evaluation; all the key human factors guidance is contained in the document. Second, because the list of test procedures is constructed in advance to be comprehensive, the evaluation itself is guaranteed to be comprehensive; the work of ensuring coverage is done ahead of time. Finally, the set of test procedures can be performed in a relatively short period of time. This is important because, in general, an inspector will review a device multiple times over a few months, but each review is limited to a few hours at most.

For the purpose of creating a regulatory EFB evaluation based on the same model as the GPS evaluation guide, a set of test procedures needs to be designed to cover all the key issues discussed in the Volpe considerations document. In brief, these topics include:

  • Usability of hardware
  • Usability of software user interface
  • Integration of hardware and software with existing flight deck systems
  • Design of training/procedures for EFBs

The evaluation test procedures must address both the overall system and any of the specific applications covered in the document that the EFB supports.
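One way to picture how such a test-procedure set could be assembled is sketched below. The procedure names and application labels are hypothetical placeholders, not items drawn from the considerations document.

```python
# General procedures apply to every EFB; application procedures are added
# only for the applications a given unit supports.
GENERAL_PROCEDURES = [
    "Read the display under bright ambient light",           # hardware usability
    "Navigate to a known page and back",                     # software usability
    "Check stowage, power, and non-interference",            # flight deck integration
    "Complete a task using only the quick-reference guide",  # training/procedures
]

APPLICATION_PROCEDURES = {
    "electronic documents": ["Search for a specific operating limitation"],
    "electronic checklists": ["Complete and reset a normal checklist"],
    "flight performance": ["Compute takeoff data for given conditions"],
    "electronic charts": ["Retrieve the approach chart for an assigned runway"],
}

def build_test_procedures(supported_applications):
    """Assemble the full procedure list for one EFB configuration."""
    procedures = list(GENERAL_PROCEDURES)
    for app in supported_applications:
        procedures += APPLICATION_PROCEDURES.get(app, [])
    return procedures

# An EFB supporting two of the four applications:
print(build_test_procedures(["electronic documents", "flight performance"]))
```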

Plans
To develop test procedures for evaluating EFB human factors issues from a regulatory point of view, Volpe is setting up joint projects with industry EFB developers. A variety of evaluations (to be defined) will be conducted over the next year to identify which procedures are both simple to perform and highly diagnostic. The first project is a collaboration with United Air Lines (UAL) on the development of their EFB concept. In parallel with the UAL project, Volpe is working to establish relationships with other EFB developers for future evaluations.

The UAL EFB will support electronic documents, a video surveillance tool to identify anyone seeking access to the flight deck, and possibly a moving map for airport surface situation awareness (i.e., during taxi operations). It may also support approach charts and a weather application. United expects their EFBs to be upgradable to host additional applications in the future. Their system concept consists of three components: a display/control unit, a processor box that is mounted separately from the display/control unit, and a data card that would contain data (e.g., electronic documents) and possibly the application software.

The human factors evaluation of the UAL EFB is part of a larger effort to develop their EFB. The human factors components of the project will be conducted in stages over the next several months. First, there will be an initial expert review of a software prototype. After a hardware/software prototype is available, there will be a small data collection where a set of test procedures is carried out to assess the usability of the device; this is the key "practice" evaluation in terms of the effort to develop a regulatory evaluation procedure. Feedback from the usability tests/practice evaluation will go into the construction of a revised software prototype. The updated prototype will again be reviewed prior to installation in a transport aircraft. United plans to conduct an evaluation of the device in their simulators and on an aircraft (outside of revenue service operations) in the spring of 2003.

As mentioned above, the usability test with the UAL EFB is an opportunity to try out a practice regulatory evaluation. Similar to the procedure used in Huntley et al., a small number of users (approximately three to five) will be asked to complete various tasks with the device. The set of tasks will be designed to cover a broad range of the issues discussed in the EFB human factors considerations document. The users will be general aviation pilots who are not experts with EFBs. Their success in completing the tasks, time taken, and number of steps will be recorded. Any difficulties they encounter, and their subjective assessment of the device's usability will also be recorded. There are three reasons for choosing general aviation pilots as users. First, UAL is conducting its own simulator tests with air transport pilots as part of their full effort, so transport pilot experiences will already be considered in the design of the EFB. Second, UAL hopes to validate that the device could potentially be useful to general aviation. Third, the flight experience of general aviation pilots may match that of the FAA inspectors who will be conducting these evaluations.
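The sketch below illustrates one way the per-task measures named above (task success, time taken, number of steps, difficulties, and subjective assessment) might be recorded. The record structure and example values are assumptions made for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class TaskObservation:
    """One participant's attempt at one evaluation task."""
    task: str
    completed: bool
    time_s: float                 # time taken, in seconds
    steps: int                    # number of steps used
    difficulties: list = field(default_factory=list)
    usability_rating: int = 0     # subjective, e.g., 1 (poor) to 5 (excellent)

# A hypothetical record from one general aviation pilot.
obs = TaskObservation(
    task="Locate the flaps-inoperative procedure",
    completed=True,
    time_s=42.0,
    steps=7,
    difficulties=["hesitated at the top-level menu"],
    usability_rating=4,
)
```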

The UAL EFB will undergo a relatively intensive set of human factors reviews. I expect to learn from the process which evaluation steps are critical, which are diagnostic, and which provide supporting information, rather than unique findings. After the UAL evaluation is complete, the evaluation test procedures will be updated and streamlined for further evaluations.


Summary

If industry development activities are any indication, EFBs are going to be a popular item over the next several years for a variety of reasons. There appears to be enough momentum behind both the technology and the business case that airlines will implement them. Because of their potentially significant effects on pilots' tasks in the flight deck, however, EFB concepts are being explored with a fair amount of caution by all parties: regulators, developers, and customers alike.

The Volpe document on human factors considerations for EFBs (Chandra and Mangold, 2000a) is a good starting point for evaluating EFB concepts. It is a comprehensive document that covers a wide variety of EFB issues. Guidance in the document will be coordinated with the FAA Advisory Circular on EFBs (FAA, 2002). The structure of the document also lends itself to convenient use by government and industry. Distributing the document in an electronic format via the Web will make it available to a wide audience.

The Volpe Center is working to help the FAA develop an operational procedure for evaluating EFBs that is based on the human factors considerations document. The evaluation procedure will be designed to fit within the framework of the typical regulatory field approval process.

The first test case for constructing an evaluation procedure will be the EFB being developed by UAL. The UAL EFB will undergo a relatively intense human factors review with at least two design iterations. The tests conducted on that device will be critiqued to determine which were diagnostic and simple for non-human-factors experts to perform. With this information in hand, the FAA will be better prepared to handle approval and certification of EFBs.


Acknowledgments

This project was initiated by the Office of the Chief Scientist for Human Factors of the Federal Aviation Administration, headed by Mark Rodgers. Portions of the work are now supported by the Safe Flight 21 Program Office. I would like to thank the FAA Program Managers, Tom McCloy and Marc Buntin, and the many other FAA staff who have actively contributed to the success of this project. Thanks also to Susan Mangold and the Air Transport Association Digital Data Working Group. This project also benefited from earlier work conducted at the Volpe Center by David Osborne, Stephen Huntley, John Turner, and Colleen Donovan.

The views expressed herein are those of the author and do not necessarily reflect the views of the Volpe National Transportation Systems Center, the Research and Special Programs Administration, or the United States Department of Transportation.


References

Carey, S. & Michaels, D. (2002, March 26). At some airlines, laptops replace pilots' 'brain bags'. The Wall Street Journal, pp. B1, B6.

Chandra, D. C. & Mangold, S. J. (2000a). Human factors considerations in the design and evaluation of electronic flight bags (EFBs), Version 1: Basic functions (Report No. DOT-VNTSC-FAA-00-22). Cambridge, MA: USDOT Volpe Center.

Chandra, D. C. & Mangold, S. J. (2000b). Human factors considerations for the design and evaluation of electronic flight bags. Proceedings of the 19th Digital Avionics Systems Conference, 10-12 October 2000, Philadelphia, PA.

Federal Aviation Administration (2002, July). Guidelines for the certification, airworthiness, and operational approval of electronic flight bag computing devices (Advisory Circular AC 120-76).

Hanson, E. (2002, June). Electronic flight bags: What airlines want. Avionics Magazine, pp. 32-34.

Huntley, S., Turner, J. W., Donovan, C. M., & Madigan, E. (1995). FAA aircraft certification human factors and operations checklist for standalone GPS receivers (TSO C129 Class A) (Report No. DOT-VNTSC-FAA-95-12). Cambridge, MA: USDOT Volpe Center.

Jensen, D. (2002, July) Electronic flight bags: An emerging market heats up. Avionics Magazine, pp. 35-42.

Shamo, M. (2000). What is an electronic flight bag, and what is it doing in my cockpit? Proceedings of HCI-Aero 2000, 27-29 September 2000, Toulouse, France.

Shamo, M., Dror, R., & Degani, A. (1998). Evaluation of a new cockpit device: The integrated electronic information system. Proceedings of the 42nd Human Factors and Ergonomics Society Meeting. Chicago, IL.

Shamo, M., Dror, R., & Degani, A. (1999). A multi-dimensional evaluation methodology for new cockpit systems. Proceedings of the Tenth International Symposium on Aviation Psychology. Columbus, OH: Ohio State University, p. 120.

Trotter-Cox, A. (2000, March). Electronic flight bag: Transitioning to a paperless environment requires more than technology. Professional Pilot, 34(3). Alexandria, VA: Queensmith Corporation.



[1] Note, however, that the Volpe document is not regulatory. The regulatory application of this information is the responsibility of the FAA or other appropriate government regulatory agencies.