
CLEAN Review Process

The CLEAN collection is a digital collection of teaching resources aligned with the Climate Literacy Framework and Energy Literacy Framework.

The CLEAN review process is depicted in the figure and described in more detail below.
Diagram of the CLEAN review process


CLEAN Review Team:

The CLEAN review team consists of educators and scientists in relevant fields.

CLEAN Review Process:

The CLEAN review process was informed by review guidelines and criteria from other collections, such as the National Science Digital Library (NSDL), the Science Education Research Center (SERC) guidelines, the MERLOT criteria, and the Climate Change Collection. Our review criteria were tested and refined through multiple test review rounds and through comparisons of reviews among different reviewers.

Review Criteria

The following documents summarize the review criteria for the different resource types:

Vetting Step Criteria (Acrobat (PDF) 51kB Aug29 11)

Review Criteria Visualizations (Acrobat (PDF) 60kB Aug29 11)

Review Criteria Videos (Acrobat (PDF) 59kB Aug29 11)

Review Criteria Short Demonstrations/Experiments (Acrobat (PDF) 61kB Aug29 11)

Review Criteria Learning Activities (Acrobat (PDF) 55kB Aug29 11)


Step 1: Collection of teaching materials

The scope and framework of the collection are defined by the Climate Literacy Framework and the Energy Literacy Framework. The target grade levels are 6-16.

Our team of CLEAN resource collectors searches educational websites to find existing digital teaching resources that are 1) relevant to CLEAN within the framework of climate and energy science defined above, 2) of appropriate granularity, and 3) suited to the target grade levels. Resources enter the pool either through submission by resource developers or through additions by CLEAN team members.

Step 2: Formal triage / vetting

Any resource that seems relevant to the collection is entered in our online review tool for further consideration by the review team; any resource that does not, at first glance, meet these criteria of relevance and quality is not recorded. The questions in our initial vetting form (Acrobat (PDF) 51kB Aug29 11) address the relevance of the resource to the collection (topic, type of educational material, grade level) and conclude with a qualitative recommendation of the overall quality of the resource.

Step 3: Reviews

Any resource that passes the initial triage is subject to at least two rounds of general review by the CLEAN resource collection team (educators and scientists), followed by a review panel and a final expert science review.

3.1 General reviews

The core piece of the CLEAN review is a set of review questions divided into three categories: a) scientific accuracy, b) pedagogic effectiveness, and c) technical quality / ease of use.

The reviewer answers the review questions, gives an overall rating for each of the three categories, and notes the strengths and concerns for each resource. An overall qualitative recommendation (low, medium, or high priority) determines which path a resource takes through the review process. These questions help the reviewer consider all key aspects of a resource, and their phrasing reflects best practices for the development of teaching materials.
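To make the structure of a general review concrete, the sketch below (in Python) shows one hypothetical way such a review record could be represented. The field names and sample values are invented for illustration and are not taken from the actual CLEAN review tool; only the three categories and the low/medium/high recommendation mirror the description above.

    from dataclasses import dataclass, field
    from typing import Dict, List

    # Hypothetical record for one general review; field names are illustrative,
    # not drawn from CLEAN's actual review tool.
    @dataclass
    class GeneralReview:
        resource_id: str
        reviewer: str
        # One overall rating per review category described above.
        category_ratings: Dict[str, str] = field(default_factory=dict)
        strengths: List[str] = field(default_factory=list)
        concerns: List[str] = field(default_factory=list)
        # Overall qualitative recommendation: "low", "medium", or "high" priority.
        recommendation: str = "medium"

    review = GeneralReview(
        resource_id="example-123",
        reviewer="Reviewer A",
        category_ratings={
            "scientific accuracy": "high",
            "pedagogic effectiveness": "medium",
            "technical quality / ease of use": "high",
        },
        strengths=["Clear learning objectives"],
        concerns=["Assessment strategy is not described"],
        recommendation="high",
    )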

Initially, two rounds of this general review are carried out by two different reviewers.

The science review section is filled out by every reviewer. Our team members are sufficiently experienced to judge whether the science presented seems solid; an expert science review nonetheless follows (see below). Scientific considerations include questions about attribution, validity of concepts, supporting references, and background materials.

The core of this review form addresses the pedagogic effectiveness of a teaching activity; this section is much less important for the other resource types. A special focus in this section is the appeal of the resource to a diverse audience. Considerations of pedagogic effectiveness include learning objectives, learning styles targeted, prerequisite skills and understanding, assessment strategies, the level of student engagement, and whether an activity is inquiry-based.

Another important piece is the usability and technical quality of a resource. Considerations in this category include readiness for use, clear presentation of content, whether the required materials are commonly found in classrooms, the amount of instructor guidance students need, and the presence of a teacher's guide (for activities).


3.2 Review panels

All teaching resources that pass through the steps outlined above are presented to a panel of four reviewers (educators and scientists) during a review camp. This team of four specialists discusses each resource, based on the prior reviews, and makes the final decision about inclusion in the CLEAN collection.

3.3 Expert science reviews

The quality and accuracy of the scientific content of each resource is checked by an expert in the respective field, in addition to the science review that is done during the first review rounds. The scientist is asked to keep the target grade level of the activity in mind when judging the scientific quality, because simplifications are often necessary for complex science topics in resources for younger students.

The scientists fill out the science review form (Acrobat (PDF) 67kB Oct18 10) where they are asked to judge the quality of the activity and give suggestions about how to improve the science content, if necessary. If they feel that scientific shortcomings can be addressed in annotations (notes to the user), we include the resource and post the scientific comments on the public display of the CLEAN collection. However, if a scientist has major concerns about a resource, it will not be included in our collection.

The scientist also decides if a resource covers cutting-edge science and needs to be revisited regularly to check if the science content is still current.

3.4 Annotations/Notes to the user

All reviewer comments are compiled into annotations (notes to the user) on the science, the pedagogy, and the usability of a teaching activity. Tips on how to use the resource in a classroom are included as well, and all of this information is provided on the public display of each teaching resource in the CLEAN collection.

Step 4: Cataloging and alignment with educational standards / benchmarks / guidelines

The CLEAN team developed a set of terms ("vocabularies") that define the topics of climate science, climate change, and energy awareness. Each resource is tagged with these terms so that a thematic search of the collection is easy for the user.
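As an illustration of how controlled vocabulary tagging supports thematic search, here is a minimal sketch in Python; the vocabulary terms, resource titles, and function name are invented for this example and do not reflect CLEAN's actual catalog implementation.

    # Minimal sketch of thematic search over vocabulary-tagged resources.
    # Tags and titles are invented; CLEAN's real catalog and vocabularies differ.
    catalog = [
        {"title": "Example activity on the greenhouse effect",
         "tags": {"greenhouse effect", "radiative balance"}},
        {"title": "Example visualization of sea level rise",
         "tags": {"sea level rise", "climate change impacts"}},
    ]

    def search_by_tag(term: str):
        """Return all resources tagged with the given vocabulary term."""
        return [r for r in catalog if term in r["tags"]]

    for resource in search_by_tag("sea level rise"):
        print(resource["title"])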

The CLEAN collection is aligned with the Climate Literacy Framework at the concept level as well as with the Energy Literacy Framework.

The CLEAN collection is also aligned with the Benchmarks for Science Literacy (AAAS Project 2061). The alignment was done manually, based on a suggestion by an AAAS Project 2061 specialist on aligning the Climate Literacy Essential Principles with the Benchmarks.

An alignment with the Next Generation Science Standards is in progress.

i) We have examined only part of the climate education landscape so far.

Our Timeline:

  • We have reviewed thousands of digital teaching resources since the start of the project in 2010; those we selected form the CLEAN collection.

ii) The review process is open.

How we get resources:

  • We search hundreds of websites containing educational materials.
  • Anyone is welcome to submit a resource for consideration through the "Suggest a Teaching Resource" form.
  • If developers have larger curriculum or module resources they would like us to review, we are happy to discuss the process and potential fits with the CLEAN collection.

iii) We are looking for input and participation.

How to get involved:

  • We are looking for additional resources and are happy to talk with developers about the gaps in our collection to help guide development.
  • If you would like to be involved as a reviewer, please sign up as a science reviewer or contact us for opportunities to do pedagogic reviews.
  • CLEAN is intended to be a resource for the community. Send us your resources, tell us your thoughts on the project and our review process, talk with us about gaps in our collection, mention submitting resources to CLEAN in your proposals, be a reviewer, suggest reviewers. Find all information and contacts here.

iv) The review process is transparent.

How to learn more:

  • If a resource did not make it into the collection, it is often a matter of alignment and granularity, not quality.
  • Review comments for the accepted resources are available online with the resource description.
  • Reviews for resources that were not included in this round are available to their developers, upon request.
