OPE: Office of Postsecondary Education
Lessons Learned from FIPSE Projects IV - May 2000 - West Virginia University

Development of a Computer System to Educate Students to Evaluate and Interpret Published Drug Studies

Purpose

Despite the efforts of peer reviewers and editorial boards, it is not unusual for errors in research design and inconsistencies in reporting to surface in professional pharmaceutical and medical journals. If patients are to receive appropriate treatment, it is essential that medical professionals be capable of thoroughly and accurately interpreting the primary literature on drugs.

Nevertheless, the education of students in the techniques of appraising drug studies has typically received short shrift in many health sciences curricula. Often, the checklists, scoring systems, and algorithms available to help medical personnel evaluate clinical drug studies are brief and do not adequately explain the concepts involved.

Faced with the need to teach students to analyze drug literature, project faculty reasoned that it would make sense to use computer-assisted instruction (CAI), because the computer allows for numerous examples and levels of explanation that students can access and review as needed. Faculty also believed that CAI would promote active learning in a curriculum that students mastered mostly by listening to lectures and memorizing facts.

Innovative Features

Faculty developed a self-contained interactive CAI program, entitled "Evaluation of Clinical Drug Studies," initially to be used in conjunction with a one-credit course, Introduction to Drug Literature. However, the program has also been used as an independent two-credit course. The program teaches students to analyze and evaluate all aspects of published clinical drug studies, with a focus on drug efficacy. It can be used to replace class lectures. The program features sound as well as graphics and includes presentation of material, examples, questions, and responses. Students can choose to review prior lessons or progress to new ones.

The project started with a three-day training offered by the manufacturer of Authorware Professional 7 (Macromedia, San Francisco, Calif.), the software chosen for the program. This user-friendly software requires only a knowledge of computer basics and allows the incorporation of sound, graphics, and animation.

Because the "Evaluation of Clinical Drug Studies" material had not been taught before, the team of faculty developers first had to outline the course and prepare the text. Three consultants and a senior pharmacy student reviewed the CAI materials at different stages in their development.

The program consists of 16 sections and subsections on topics such as experimental design, study settings and patient factors, measurements, statistical concepts and analyses, and data handling. Students received basic computer instruction and were then assigned to computers in groups of two or three. Each group received a different published clinical drug efficacy study to review as part of the course. Six exercises during the semester asked questions about the study and helped faculty evaluate students' comprehension of the program. Students were expected to cover one section of the program per week on their own. About 15 minutes a week of class time were initially allotted for questions and discussion of the program.

Evaluation and Project Impact

A pretest and a post-test were administered to a control group and a test group of students.

The control group consisted of 77 pharmacy students taking Introduction to Drug Literature the year before CAI was introduced. These students had had no formal instruction in drug study evaluation. The pretest was designed to measure their understanding of important characteristics of clinical drug studies and their ability to analyze and evaluate them. A similar post-test was administered at the end of the course.

The test group consisted of 76 students enrolled in Introduction to Drug Literature with the CAI component. The pre- and post-tests were almost identical to those given to the control group.

Both groups had similar age and GPA profiles. Slightly more students in the control group than in the test group had prior experience with statistics or research methods. Although the control group was therefore expected to score better on the pretest, the test group's scores were two percentage points higher.

The mean post-test scores of the test group showed a significant increase over the pretest scores, from 43.6 percent to 77.4 percent. The post-test scores of the control group unexpectedly declined; this proved to be the result of anomalies in the administration of the test. When corrected for these anomalies, the resulting scores were quite similar: 43.8 percent on the pretest versus 42.2 percent on the post-test. Similar tests administered to a later class of CAI users exceeded the first year's findings: students scored an average of 42 percent on the pretest and 83 percent on the post-test.

Faculty had estimated that the CAI program would take about 15 hours to complete. Students, however, turned out to be less assiduous than expected, spending a total of 11 hours on average, with a minimum of four hours and a maximum of 25 hours. Almost all students skipped at least one section, and nine percent skipped more than half of the program.

At first glance, only a weak correlation emerged between total time spent on CAI and post-test scores. However, there was a considerable difference between the time spent on the CAI program by students with the highest and those with the lowest post-test scores. Those with scores of 90 percent and above spent, on average, at least ten hours more on the CAI program than the students with the lowest scores.

Student attitudes toward the program were measured by four surveys administered over one semester. Despite complaints that they did not receive enough credit given the additional load represented by the program, students generally felt that the information they were learning was new, of high quality, and geared to the appropriate level. They liked the CAI screen design and the way in which information was presented. They were almost equally divided on whether they preferred learning through lectures or through CAI. Despite the agreement among pharmacy educators about the need to learn to evaluate drug studies, some students felt that the program, although appropriate for physicians and researchers, was not necessary for them.

Lessons Learned

Although Authorware 7 is relatively easy to use, faculty took longer than expected to master it. That, together with setbacks such as having to resize each screen to match the monitors of newly acquired computers, led faculty to put in approximately 50 hours of preparation for every one of the program's 15 hours. An instructional designer would have shortened the process considerably by assisting faculty with screen layouts, font and color selection, and proper use of the software.

Project Continuation

The CAI program continues to be offered as a large part of the Introduction to Drug Literature course, which now carries two credits. Students spend approximately one hour per week on the program and one hour in classroom activities, a portion of which is devoted to exercise review and question-and-answer sessions related to the CAI material. The program is now available in Macintosh and Windows versions and can be accessed via the Internet.

Dissemination and Recognition

Version 2.0 of "Evaluation of Clinical Drug Studies," which is copyrighted, has been distributed to a number of institutions. It is being used in West Virginia University's baccalaureate and doctor of pharmacy curricula, and has been used for distance instruction at Oregon State University, for distance and on-site instruction at Duquesne University, and for pharmacist training in literature evaluation techniques.

The subject of a number of presentations (to the American Association of Medical Colleges and the Symposium on Computer Applications in Medical Care among others) and publications, the program received one of the first three annual "Innovations in Teaching" awards from the American Association of Colleges of Pharmacy.

Project faculty obtained a grant of $5,000 from the pharmaceutical manufacturer Astra Merck to work on a condensed practitioner's version of the program. An expansion to include critical evaluation of pharmacokinetics studies has captured the interest of the Drug Information Association, which should yield opportunities to use the program in government agencies and the pharmaceutical industry.

At West Virginia University, the recent conversion to a six-year pharmacy curriculum will place more emphasis on clinical expertise, and therefore on the ability to critically analyze drug studies, than was the case in the past. An additional FIPSE grant to institute computerized problem-based learning (PBL) at the school of pharmacy should also help to continue to change student attitudes toward the use of computer instruction.

The latter grant is designed to integrate the educational methods of PBL and concept mapping with computer technology, drawing on the advantages of each. Ten multidisciplinary PBL cases, including concept maps for case objectives, have been developed and field-tested in medicinal chemistry, pharmaceutics, and clinical sciences. The cases are currently in their second year of use and evaluation.

Available Information

For additional information, contact:

Marie A. Abate
West Virginia University
School of Pharmacy
P.O. Box 9520
Morgantown, WV 26506-9520
Telephone: 304-293-1463

Last Modified: 09/10/2007