Institute of Museum and Library Services

Grant Applicants - Outcome Based Evaluation

Purposes of OBE

IMLS believes the two most important purposes of evaluation are (1) to provide essential information for good decisions about priorities, deployment of resources, and program design, and (2) to help communicate the value of initiatives, whether these are programs, services, or organizations such as libraries and museums.

The first step in choosing an evaluation method is deciding why to do it. Here are some good reasons:

  • know the extent to which you’ve met your project or program goals;
  • know the progress you’ve made towards large or long-term goals, and what’s still needed;
  • know the quality of your program or service (you define “quality” for the purpose of an evaluation; quality can include efficiency, productivity, cost control, effectiveness, value to a community, or a variety of other values);
  • know if your program warrants more resources, fewer resources, or no resources at all (should continue, expand, or cease);
  • communicate the importance of your program, service, or initiative to potential users, policy makers, and/or resource allocators.

This list is not exhaustive. You may want evaluation to meet all of these needs and more. The more purposes an evaluation serves, the more thought you need to give to its design, and the more complex and expensive it will probably be. Few organizations can afford to cover all these bases; your choices control scale and cost.

This table shows the four most common categories of messages about libraries or museums, along with the models for collecting and understanding information that typically support them. In order of increasing importance to most decision-makers outside the library and museum communities, they are:

Information Strategies for Understanding Museum and Library Performance*

How Much We Do: inputs and outputs (statistics, gate counts, Web use logs, and other measures of quantity and productivity)
How Well We Do It: customer satisfaction, quality benchmarks, rankings
How Much We Cost / What We’re Worth: return on investment and cost-benefit calculations
What Good We Do / Why We Matter: outcomes measurement, impact assessment

See the Webography for examples of these approaches in the library and museum contexts. All of these messages and approaches (and others) can be valid. The best evaluation strategy depends on:

  • the most important things that you want information to help you do or show,
  • who you hope will use the information,
  • how you want them to use it, and
  • what you can afford or are willing to do.

Once you make those choices, the remaining steps are much easier: identifying an evaluation approach; choosing methods, instruments, and samples; and developing specifications, creating an RFP, or selecting an evaluator.
