Screening and Assessment in Head Start
 

Initial screening of children, as mandated by the Head Start Program Performance Standards, is carried out to identify evidence of developmental, sensory, or behavioral concerns. Teachers and disabilities specialists will value this discussion of the process for identifying the strengths, learning needs, and supports a child requires. Key concepts and guidelines are discussed in this issue of the Bulletin.

The following is an excerpt from the Head Start Bulletin:

Screening and Assessment in Head Start
 



 

Screening and Assessment in Head Start
By Tom Schultz, Head Start Bureau

How are our children doing? This question stays on the mind of every Head Start and Early Head Start staff member, manager, parent, and community partner. It is the key question we seek to answer in our national research and evaluation studies. And it led Congress in 1998 to enact new legislative mandates, which Head Start programs are now beginning to implement, to assess children's progress toward specific learning outcomes and to analyze and use information on child outcomes in their local program self-assessment process.

How are our children doing? A major way we begin to answer this question is through initial screening and ongoing assessment of every child in Head Start and Early Head Start. As mandated in the Program Performance Standards, initial screening of children is carried out to identify evidence of developmental, sensory, or behavioral concerns and to determine if children should receive a more formal evaluation to identify disabilities. Ongoing assessment is also required for each child to identify his strengths and needs, to help tailor learning experiences and other services, and to support staff in communicating and working with parents and families.

This Bulletin provides a wide variety of ideas and strategies on initial screening and ongoing assessment; connections between assessment, curriculum, and individualization; and ways to implement new policies on assessing and analyzing information on child outcomes in your program. Authors from the Head Start Bureau, research projects, state government agencies, and local Head Start and Early Head Start programs contributed the following articles to help you think about and work on new ways to improve your program:

  • We begin with an article, by Judy Jablon and Amy Dombro, experts in early childhood education and assessment, that reinforces the central importance of staff becoming skilled observers of children and using their observations to enhance children's learning and development.
  • Four articles describe efforts in local Head Start agencies to improve screening, assessment, and linkages with program curriculum. Jan Greenberg tells how East Coast Migrant Head Start created their current screening and ongoing assessment approach, and how they are planning to meet new mandates to use child outcome data in program self-assessment and continuous improvement. Leaders from Head Start programs in Jackson, Michigan, and Seattle, Washington, describe efforts to improve and link curricula and ongoing assessment efforts. Larry Schweinhart and Ann Brown describe how the High/Scope Child Observation Record system is being utilized in Kalamazoo, Michigan, as the foundation for an ongoing child assessment system.
  • Jim O'Brien of the Head Start Bureau provides an article on how initial screening and ongoing assessment successfully support children with disabilities in Head Start and Early Head Start.
  • Leaders from Early Head Start programs in Vermont, Delaware, and Missouri provide accounts of efforts to involve parents as integral partners in the assessment process; to use a research-based tool to assess the quality of classroom environments to complement child assessment and internal program monitoring efforts; and to use an assessment tool to improve the quality and outcomes of services by home visitors to children and families.
  • Head Start Director Gayle Cunningham shares her perspective on and lessons learned from participating in a Head Start Quality Research Center Study. A summary of recent HSQRC findings accompanies her interview.
  • Two articles describe statewide collaborative efforts by Head Start grantees in Rhode Island and Ohio to develop common approaches for assessing and analyzing information on child outcomes.
  • The Head Start Child Outcomes Framework, which accompanied the recent Information Memorandum on Using Child Outcomes in Program Self-Assessment (IM-00-18), is reproduced as an additional resource for readers.
  • Dollie Wolverton of the Head Start Bureau provides an overview of the National Head Start Child Development Institute which provided more than 3200 Head Start managers with a week-long professional development experience keyed to the goal of fostering and assessing progress on child outcomes and school readiness.

How are our children doing? This question was a passionate concern for Helen Taylor, Associate Commissioner of the Head Start Bureau from 1993 until her death on October 3, 2000. Helen worked tirelessly to strengthen program quality, increase funding, and enhance accountability. She believed wholeheartedly in the importance of Head Start agencies using state-of-the-art screening and assessment methods and using assessment to improve curriculum, enrich learning experiences, and engage families as partners. She recognized the challenge of demonstrating accountability for child outcomes in new ways as critical to Head Start's future. We dedicate this Bulletin to her memory.

Tom Schultz is the Director of the Program Support Division, Head Start Bureau. T: 202-205-8323; E: tschultz@acf.dhhs.gov.



 

Key Concepts

Screening

The screening process is the preliminary step used to determine if sensory, behavioral, and developmental skills are progressing as expected, or if there are causes for concern or a perceived need for further evaluation. The screening itself does not determine a diagnosis or need for early intervention. However, it may suggest the need for an in-depth evaluation that can make those determinations. To ensure that children with special needs are identified early, Head Start requires screenings to be conducted within 45 days from entry into the program. Screenings are not one-time events; if a child is suspected of having a developmental delay later in the program year, a referral is made for a formal evaluation at that time.

Assessment

Assessment is an ongoing process to determine a child's strengths and needs. It also assesses the family's strengths, needs, resources, concerns, and priorities. Information from the assessment is used to determine strategies to support the development of the child within the context of the classroom as well as his family, culture, and environment. Assessment is both a formal and informal process. The formal process includes the use of published developmental tests, checklists, or structured observational procedures. Informal assessment includes discussions with parents or caregivers and casual observation of children engaged in their daily tasks.

Formal Evaluation

A formal evaluation is performed by a qualified professional to diagnose a developmental, sensory, or behavioral condition or disability requiring intervention. Most children will not be referred for formal evaluations-only those children identified through the screening and ongoing assessment processes as suspected of having a condition or disability that might require intervention. The Early Intervention/Part C agency or the Local Education Agency in the community must be notified of a child who needs formal evaluation to determine his or her eligibility for early intervention, special education, and/or related services as called for by the Individuals with Disabilities Education Act (IDEA). When formal evaluation determines that a child does have a disability, programs work with families and local partners to develop an Individualized Family Service Plan (IFSP) or Individual Education Program (IEP) to address needs identified by the formal evaluation.


 

 

Using What You Learn From Observation: A Form of Assessment
By Judy Jablon and Amy Dombro

Every decision you make about the environment, daily routines, and learning opportunities in your classroom affects children's learning. By assessing children's learning through ongoing observation, you gain insights into children's strengths, knowledge, interests, and skills. You discover barriers that may be inhibiting their success. You reflect on daily life in your program and make adaptations that enable children to overcome obstacles and build on what they know and do well. By using what you learn from observation, you can foster each child's competence and success and create and maintain a high-quality program for children and families.

Some people think of assessment as an end point-something you do to prepare a report for families or to meet a program's requirements. Actually, assessment should be used as an ongoing process to answer questions about children's growth and learning, and to find ways of supporting their development.

Assessing to Find Answers About Individual Children

There is always something new to learn about a child-even children you think you know well. If you make a habit of asking questions, you will get to know who a child is and can keep track of who that child is becoming. Asking specific questions can provide a focus for observations and lead to solutions. You have repeated opportunities to witness children practicing skills, demonstrating knowledge, and exhibiting behaviors in a familiar and comfortable environment. Not only can you observe what children know, but also how they think and solve problems. By collecting observations, you can find answers to your questions and build a picture of children's performance and progress without interfering with their daily activities or usual behavior.

For example, when Laura, an infant caregiver, senses something amiss with five-month-old Kara's fine motor development, she refers to the observational checklist she uses to monitor children's development. Based on her observations, she realizes Kara is not bringing both hands to midline, while Taylor, another child the same age, does so frequently. Laura continues observing and decides to talk to the physical therapist who consults with the program to request activities to help Kara reach this milestone.

To assess four-year-old Kathy, the teacher photographed Kathy and Josie playing together in the block area. Several days later, he made some notes about the conversation Kathy was having with another preschooler. On yet a third occasion, he saved a painting Kathy made with Josie. When it is time to evaluate Kathy's performance and progress, her teacher's judgments about her growing ability to interact with her peers will be based on these and other observations.

These examples illustrate how day-to-day assessment of young children can help monitor their development and learning and help you make meaningful decisions about how to support their continued progress.


What are some of your questions about the children in your care? Observing can help you learn about a child's:

  • Health and physical development. What kinds of large motor and small motor activities does the child prefer? How does the child manipulate scissors and crayons? Does the family have concerns about the child's health?
  • Temperament. Can a child generally be described as flexible? Slow to warm up or fearful? Feisty or intense?
  • Skills and abilities. What does the child do well? What does the child find challenging? What skills is the child trying to achieve?
  • Interests. What activities cause a child's eyes to light up? What does the child talk about? When given a choice, what does the child choose to do?
  • Culture and home life. How does the child express cultural or family traditions during play? How is discipline handled and affection expressed at home?
  • Approach to learning. How does the child approach a new activity? How would you describe the child's interaction with materials?
  • Use of verbal language. How much language does the child have? Does the child talk to other children? Other adults? What does the child talk about?
  • Use of body language. How does the child move? Does the child use gestures? Is the child physically expressive?
  • Social interactions with adults and peers. Does the child interact with other children? How does the child initiate interactions? How does the child handle conflicts?
  • Cognitive skills. Does the child show interest in books and other print material? Does the child notice similarities and differences?

Assessing to Inform Decisions About Programming and Teaching

Observing and reflecting lead to insights and interventions that work. You can apply what you learn from observations to modify your program in order to adapt your environment, daily routines, and teaching strategies. At the end of the day, Karlene, an infant caregiver, reflects on what she has seen this past week:

Over the past three days, Lynn, age 7 months, has been getting up on her hands and knees and rocking back and forth. Today, she put one hand in front of the other, moved a knee forward, rocked slightly back and then crawled for several feet.

We are always mindful of safety. Since we have a child starting to crawl, I will crawl around the floor and look for potential hazards. That way we'll be able to let Lynn freely explore the room.

Periodically observing daily routines ensures that they get the same attention and planning as all the other valuable learning experiences in your program. Jeff, a preschool teacher, observed that rest time was becoming difficult, especially with Nicholas, age 4 1/2. As Jeff writes at the end of the week:

Nicholas whines when I dim the lights and say it is time for a rest. He tells me, "I want to play, not sleep." On Tuesday, he lay down on his mat for a few minutes and began fidgeting. Soon he rolled off his mat and onto his neighbor's.

My solution has been to adapt rest time by letting Nicholas-and other children who don't sleep or nap-bring a quiet work activity with them to their mats, such as paper, crayons and books. This seems to be working.

By observing, you learn about children's interests, strengths, and experiences. You can use this information to individualize instruction for the children in your program. A preschool teacher notes:

Leticia, age 3, whose home language is Russian, rarely speaks in school. One day we were talking about pets and Leticia didn't say a word. But the next day, she and her mom came to school with Leticia's guinea pig from home.

I found out Leticia understands more English than I thought she did. I knew I had to build on this to help her feel more comfortable speaking at school. So, we wrote a story about Tiger, her guinea pig. Leticia worked on an illustration of what Tiger eats. I always have a camera on hand for moments like this so I took some photos of Leticia, her mom, and Tiger. I gave one photo to Leticia to take home and kept some in our class photo album. After this, Leticia began talking more to me and other children.

Assessing to Understand Challenging Behavior

Every teacher struggles with challenging behavior. Careful assessment of young children can give you the clues to address discipline issues. Asking questions, looking for strengths, and enlisting the support of families in positive ways can benefit everyone involved. This case study of Denise illustrates how one teacher used these strategies successfully.

Regina, an excellent classroom manager and usually quite resourceful in finding ways to support children, did not know how to respond to Denise, a preschooler in her classroom. She explains:

When we're sitting in circle, Denise doesn't seem to understand what is going on. She doesn't follow directions. I'm continually telling her to settle down and to stop talking.

I decided to begin recording mostly positive behaviors-for myself and to share with Denise's grandmother, who has had more than her share of people complaining about Denise's behavior. I thought by building our relationship and strengthening the relationship between Denise and her grandmother, she would get the support she needs at home and in school.

At first Regina had to work hard and look carefully to find something to write about. Over time it became easier. Here are a few observations she recorded and sent home:

During a group discussion about favorite foods, Denise looked around and fidgeted as she waited for a turn to speak. At her turn, she said her favorite food was blueberry pancakes. She said she could eat 100 of them. She smiled when three other children agreed.

Denise's face tightened when another child crumpled the edge of her painting. She moved her hands as if to pinch him. Then she looked over and called me for help. I asked what happened. Paul explained he crumpled Denise's painting by accident when he hung his painting up on the drying line. He told her he was sorry. She smiled and said, "That's OK" to Paul.

Regina has used her observations not only to build her relationship with Denise, but also to strengthen Denise's relationship with her grandmother, turning grandmother into an ally supporting Denise-at home and in school. Regina explains:

Denise is starting to feel better about herself. She beamed and told me her grandmother is proud of her. Denise's grandmother has called me to say how much she appreciates the positive notes. I have come to care for Denise and the way she grabs life so fully, even though that means she may disrupt circle time.

Assessing to Foster Each Child's Competence and Success

Assessment can help teachers make good decisions about how to intervene in ways that support each child's success as a learner. As you get to know children and your respect and appreciation for them grows, it is more likely your decisions about how and when to intervene will be based on their interests and needs. This is the essence of individualizing.

Sometimes the best thing you can do to support a child's learning is to step back to let the child experience something-even if that means the child will take a risk or make a mistake. Taking a few moments to observe a child at play or work may be just what you need to figure out if you should stay out of the action. When you do step in, rely on your observations to guide you. Ask the right questions, make the appropriate comments, or offer materials that will stimulate and stretch the child's thinking.

The chart on this page shows examples of decisions teachers might make based on their knowledge, appreciation, and respect for the children under their care. The next time you observe children, think of a question you can ask about a child or how you might intervene to support a child's success.

Chart: What You Observed and What You Might Decide To Do

6-Month-Old Child: Babbles back when you talk with him

What You Might Decide To Do:

  • Note his language development and desire to communicate by describing to him what is happening during his daily routines, such as diaper changing and mealtimes.
  • Pause to let him respond through sounds and gestures.

22-Month-Old Child: Cries lately when her grandmother leaves in the morning.

What You Might Decide To Do:

  • Be available to support her when it is time for grandmother to say good-bye.
  • Show respect and let her know she can share her feelings with you by listening to and acknowledging her feelings.
  • Show her the picture of her family hanging on the wall.

3-Year-Old Child: Told about making dumplings with her parents over the weekend.

What You Might Decide To Do:

  • Provide cultural continuity by talking about foods children eat at home during lunchtime conversation.
  • Add books with pictures of foods from different cultures to the library corner.
  • Invite Baili's parent(s) to prepare dumplings or another favorite dish with the children.

4-Year-Old Child: Arguing with Edward about who is taller.

What You Might Decide To Do:

  • Observe if they can problem solve on their own. (In a few minutes, Sarah gets a ruler to measure Edward.)
  • Make a growth chart with the class to mark their changing heights.

5-Year-Old Child: Built a barn complete with stalls and a milking machine in the block area.

What You Might Decide To Do:

  • Ask him to talk about how he helps his older brother milk the cows in the barn.
  • Reinforce what he already knows by hanging up pictures of the interior and exterior of barns in the block area.

We encourage you to conduct ongoing assessments. Everyone will benefit. Your work will be much more satisfying as you ask and answer questions about teaching and learning. Your relationships with parents also will be enriched by the stories you share with them. Finally, you will encourage the development of the children in your care as you create an appropriate learning environment and nurture each child's individuality.

This article is adapted from the book, The Power of Observation, by Judy Jablon, Amy Dombro, and Margo Dichtelmiller (Washington, D.C.: Teaching Strategies, Inc., 1999).

Judy Jablon is an Early Childhood Curriculum and Assessment Specialist and a developer of the Work Sampling System. T: 973-761-4118; E: judyjablon@aol.com.

Amy Dombro is a consultant to infant/toddler and family day care programs and a trainer of Head Start and child care staff. T: 212-928-0545; E: amydombro@aol.com.


 

 

The Challenge of Assessing Children: One Migrant Head Start's Story
By Jan Greenberg

East Coast Migrant Head Start Project (ECMHSP), like every other Head Start program in this country, is actively engaged in meeting the challenge of implementing the legislative changes concerning Head Start program and child outcomes. These include, "establishing additional results-based educational performance standards and performance measures, and adapting these standards and measures for use by programs in their self-assessments..." (ACYF-IM-HS-00-18, 8/10/00).

This challenge offers us an opportunity to take a step back and look at what we already are doing to measure child outcomes, and what still needs to be done. It entails reviewing our current screening and assessment system, particularly our assessment tool and process. For many years, ECMHSP centers have used the Denver II to screen all children, and the Early Learning Accomplishment Profile (E-LAP: ages birth to 3 years) and Learning Accomplishment Profile (LAP: ages 3 to 6 years) for ongoing assessment and to track children's progress across a broad range of skills. We are asking ourselves a specific question: Does the E-LAP/LAP provide adequate child outcome information? If not, what other assessment tool does?

This, of course, raises other questions: What criteria will we use to evaluate different assessment tools? How would a change of assessment affect our continuity system? What are the pros and cons of changing our assessment system? If we change our assessment tool, how will that impact programs that integrate the E-LAP/LAP into their curriculum framework?

ECMHSP has established a Child Assessment Committee composed of ECMHSP, delegate agency, and center staff to address these and other issues and questions. This article provides background information about ECMHSP, describes work the committee has accomplished, and explains the questions and issues under discussion.

Who We Are

ECMHSP was established in 1974 to provide continuity of Head Start services to the children of migrant farm workers and their families along the East Coast of the United States. It has evolved over the years from a small, two-center program in Florida, to a multi-state, multi-agency, multi-center operation. Currently, ECMHSP contracts with 20 delegate agencies in 12 states (AL, DE, FL, GA, ME, MD, NC, NJ, NY, PA, SC, and VA). There are a total of 88 centers serving over 8,000 infants, toddlers, and preschoolers in full-day programs.

The majority of ECMHSP children and families are Spanish-speaking families from Mexico, Texas, and Puerto Rico. ECMHSP programs also serve children and families from Haiti, Guatemala, Canada (Mixtec Indians who cross the Canadian border into Maine and work in the blueberry barrens), and the United States.

Programs seek staff members who speak the children's languages. Parents often are hired as teacher's aides for this reason. Many of the families live in Florida from October through May and travel up-stream after the agricultural season is over to work in northern states. Centers open and close with the comings and goings of migrant families, rather than operating on a school year schedule. Thus, ECMHSP programs share children as their families move from place to place to do agricultural work. Many of the children come into Head Start as infants and stay within the ECMHSP system until they transition into kindergarten.

What We Do: Screening and Assessment

Within this context, ECMHSP has developed and implemented a screening/assessment system to provide—

  • Important information about children's competencies and skill development;
  • Opportunities for family involvement and input;
  • Information for use by classroom teachers in individualizing learning activities and creating classroom lesson plans; and
  • A communication and continuity link between all the centers in the ECMHSP system (for E-LAP/LAP).

All children are screened within the first two weeks of enrollment using the Denver II instrument. The first E-LAP/LAP assessment is completed in the next month. The results, along with family input, are used to create individual Child Activity Plans (CAPs) for infants, toddlers, or preschoolers.

The CAP identifies learning objectives and related classroom activities in the following areas of development (similar to the Domains in the Head Start Outcomes Framework): language, cognitive, gross motor, fine motor, pre-writing (only in the LAP), social-emotional, and self-help. Information from the CAPs is used to develop lesson plans for toddler and preschool classrooms. E-LAPs/LAPs are updated monthly, as are the children's CAPs. Programs that are open for eight weeks or less only use Denver II screenings.

ECMHSP uses the E-LAP/LAP as a key part of its communication and continuity system. Classroom teachers assess each child each month and record the information on two identical E-LAP/LAP forms. One copy is provided to families when they inform the center they are leaving; the other goes in the child's records folder, which contains education, health, and family information.

Children's records are sent back to the ECMHSP main office when they leave a center. When families come to the next ECMHSP center, they give their child's E-LAP/LAP form to the classroom teachers. The center also requests the child's records from the ECMHSP main office. This information helps teachers at the new center, as they continue the assessment process and monthly updates. Thus, the E-LAP/LAP form is a communication tool that allows centers to provide continuity of education services as children move.

ECMHSP chose to use the Denver II screening a number of years ago because it met important criteria. It is useful because—

  • Hispanic children are included in the re-standardization;
  • It is a recognized screening tool;
  • There are English and Spanish versions;
  • Training resources are locally available to centers;
  • It covers children with ages ranging from birth to six years; and
  • It can be administered by paraprofessionals.

ECMHSP chose to use the E-LAP/LAP for ongoing child assessment for many of the same reasons. There are English and Spanish versions, it can be administered by paraprofessionals, it includes children with ages ranging from birth to six years, and it is a recognized assessment tool. Both the Denver II and E-LAP/LAP are relatively easy to administer once staff understand the purpose of the tools, the information they provide, and the mechanics of administration.

With such a long-standing and integrated systemwide screening and assessment process in place, reviewing our assessment tool/process to consider change could be a daunting task! However, ECMHSP looks at it as a chance to strengthen our assessment system, reinforce the connections between assessment and curriculum, and measure more accurately children's progress towards established goals and outcomes.

Where We Are

The ECMHSP Child Assessment team has established a course of action. They are gathering information on commercial assessment tools, reviewing Head Start materials on child outcomes, program performance measures and program self-assessment, and establishing criteria and indicators for evaluating assessment tools.

The last activity has entailed quite a bit of discussion to flesh out the indicators for each criterion. So far, our global criteria include staff and training, cost, age-range, play-based, correlation with curriculum, and correlation with the Head Start child outcomes.

One of the criteria, the need to be culturally and linguistically appropriate, requires thoughtful consideration. We ask ourselves, "What do we mean by culturally and linguistically appropriate? How do we determine whether or not an assessment tool is culturally and linguistically appropriate?"

Since all assessment tools and assessment developers are influenced by culture, no assessment is entirely free of bias. Assessment tools measure what is thought to be important to the developer and to the society at large. For example, mainstream American society values competencies in reading and writing. Thus, many assessments emphasize related cognitive and fine motor skills. Other cultures value oral traditions and interpersonal relationship skills. Because our programs serve children from diverse cultural and linguistic backgrounds, we have developed the following indicators and questions. These preliminary indicators may be refined as we apply them:

  • Pilot/standardization studies. Was the assessment tool piloted with children similar to our population of children?
  • Availability of tool in other languages. Is the tool available in Spanish? Other languages? If yes, is it a direct translation from the English, or is it an adapted translation (i.e., items assess information similar to the English version, but use words, pictures, and concepts that are culturally familiar and relevant to Latino or other cultures)?
  • Protocol for item administration and interpretation of responses. Do assessors have flexibility in administering items? If a child gives a correct answer, but in his/her home language, is that response acceptable? What kind of latitude do assessors have in interpreting children's responses? Do assessors have to use a prescribed kit of assessment materials, or can they use materials familiar to the children?

Where We Are Going

The Child Assessment Team is ready to begin the work of evaluating selected assessment tools using our criteria and indicators. Since our programs already use the LAP, we will begin with that tool. This will also entail correlating the LAP with The Creative Curriculum used by many migrant programs.

Once the committee has evaluated all the assessment tools, ECMHSP senior management staff will review the information and make an informed decision. They will take into account the impact of any change on our established continuity system, staff training issues, the integration of assessment and curriculum, and measurement of child outcomes as mandated by Head Start.

This is an exciting time for ECMHSP. Our system review undoubtedly will have a profound effect on the educational services we provide. While we already know a great deal about children's developmental and educational status, this work will help us know better where we want our children to go and how to tell when they get there. This is a golden opportunity to create and deliver an even stronger, sounder educational experience for young children-one that will prepare them to become lifelong, successful learners.

Jan Greenberg is the Training and Development Associate at ECMHSP. T: 703-243-7522; E: greenberg@ecmhsp.org.

The following ECMHSP staff contributed to the article: Leila Arjona, Clara Cappiello, Grace Horsman, and Kim Stacy.


 

 

Linking Assessment with Curriculum
By Margo Dichtelmiller, Mary Cunningham DeLuca, and Brenda Webster

Developing a curriculum to help Head Start teachers ensure the highest quality education for the young children they serve is an important objective of our large and diverse program. We offer Head Start for over 750 preschoolers, and Early Head Start for 95 infants and toddlers, through the Community Action Agency in Jackson and Hillsdale Counties, Michigan. Preschool program options include half day, full day, extended day, and full year center-based services.

The journey to our current curriculum began in 1997, when we decided we were not satisfied with our curriculum. Like other Head Start programs, we had purchased and used several different curricula over the preceding ten years. However, none fully met the needs of our program and families. As we looked at those available, cost was often a factor, as was compatibility with the Head Start Program Performance Standards. It became apparent that what we wanted did not exist. We would have to take the leap and develop our own.

As it turned out, the path to curriculum development was not straightforward. At the same time we were discussing curriculum, we were also struggling with priorities related to assessment: how to develop outcomes, collect data, more fully utilize the Performance Standards, and integrate the National Association for the Education of Young Children (NAEYC) accreditation standards into our program.

Assessment was also a concern among our teachers. They thought that our assessment system required too much paperwork. They viewed it as an additional and unnecessary burden. We wanted an assessment approach that met certain criteria: covers all areas of the curriculum; identifies the skills and behaviors teachers need to look for; is a child-friendly approach and can be used during daily classroom activities; and provides information to help teachers make decisions about what to teach. We decided to begin the process of change by looking for a new approach to assessment. After careful review, we concluded that the Work Sampling System met our criteria and addressed our needs.

Learning to Use a New Assessment System

We phased in the Work Sampling System over a two-year period. This strategy proved to be cost effective and allowed staff time to become fully knowledgeable about the assessment system. We believed that by moving slowly, we would have greater success. Margo Dichtelmiller, one of the developers of Work Sampling, trained our education staff. Head Start training dollars and program dollars paid for this training.

The 1998 school year started with staff development focusing on general assessment principles, observation/documentation methods, and introduction of the Work Sampling System. The consultant continued to meet with our educational staff every six weeks. Separate meetings were held with teachers and teacher assistants. At the beginning of each session, the group reflected on their successes and challenges using the assessment system and we encouraged teachers to share solutions to challenges they encountered.

In October 1998, we created a Child Progress Report based on the Work Sampling System Summary Report and trained staff to use it. The report includes space for teacher evaluations of a child's performance and for a short narrative about the child's progress. At the first Parent-Teacher Conferences in November, teachers shared this report with parents. Staff and parents liked this new approach to reporting, which used descriptive language to highlight the child's competencies. Using teacher input, the progress report form has been revised several times to make it as clear and informative as possible.

The following spring, the consultant met with each classroom team for thirty minutes to review their Work Sampling materials. During these sessions, they examined observation notes, developmental checklists, and progress reports and discussed questions and concerns. This approach had several important benefits. First, it allowed staff and consultants to become acquainted and build a level of trust. Second, it provided a safe environment to monitor how well teachers were using the Work Sampling method and to answer questions specific to each classroom. At this point, we focused on using observation to complete the Work Sampling Developmental Checklists. Although our staff were always watching and learning from children, they needed to learn how to make systematic and objective observations in order to make use of the assessment process and materials.

In 1999, the second year, we made one significant change in the staff development program in response to staff feedback. Instead of meeting separately with teachers and assistants, we convened smaller groups of classroom teams. The same workshop was delivered four times so groups of 15-20 had ample opportunity to ask questions. We introduced Portfolio collection, the final piece of the Work Sampling assessment system, during the 1999-2000 school year, but teachers were not expected to use Portfolios until the next school year.

Turning the Focus to Curriculum

As teachers became familiar with Portfolio collection, we concentrated on documenting Language and Literacy goals. This was consistent with the emphasis in Head Start on emergent literacy skills. However, we soon realized many teachers were not familiar with the most recent research on emergent reading and writing. More significantly, they needed concrete ideas for ways to promote literacy growth through interactions with children in developmentally appropriate ways. It was apparent that there was a need to revamp the curriculum for three to five-year olds.

So the new assessment system led us back to curriculum! We developed a preschool curriculum, Planned Play, that reflects the assessment goals and meshes with Head Start mandates.

The curriculum addresses the seven domains of child outcomes identified by the Work Sampling System: Personal and Social Development, Language and Literacy, Mathematical Thinking, Scientific Thinking, Social Studies, The Arts, and Physical Development. These domains overlap with the eight domains included in the Head Start Child Outcomes Framework.

Each curriculum domain has components. For example, Personal and Social Development has five components—

  A) Self-concept
  B) Self-control
  C) Approach to learning
  D) Interaction with others
  E) Social problem solving

Within each domain, curricular components are represented by several indicators for four-year-olds taken from the Work Sampling Developmental Checklist. These become child objectives.

For each indicator, we have identified specific behavioral expectations for children, based on the rationale and examples in the Work Sampling Developmental Guidelines, teacher knowledge, and classroom experience. Expectations for children are also based on what we expect children to do by the end of their participation in Head Start, before making the transition to kindergarten. In some cases, we identified separate expectations for three-year-olds, where expectations differed significantly from four-year-olds. For example, in the domain of Language and Literacy, under the component of Writing, one indicator is: Represents stories through pictures, dictation, and play.

The expectations for children for this indicator include:

  • Understands that pictures can represent objects
  • Acts out stories or represents them with flannel board pieces
  • Draws a picture and tells a story about it
  • Labels pictures with words
  • Dictates to teacher a story about their picture
  • Uses characters or information from stories in the dramatic play

The curriculum lists teacher behaviors that support this learning, including:

  • Use props in dramatic play that allow children to act out stories and their own experiences
  • Ask children to tell you about their picture and write what they tell you
  • Give children many open-ended materials to explore and use for representation
  • Add props to the block and truck area to encourage representation

In addition to outlining expectations for children and teaching strategies to support children's development, the Planned Play curriculum is based on the use of long-term thematic units. Our teachers agreed that themes are appropriate for young children; they promote in-depth investigation and reinforce children's interests. Our teachers were also glad to have more time to inform and involve parents in the longer studies. During the 2000-2001 school year, teachers will participate in staff development activities related to the curriculum and use of thematic units.

We have worked hard to dovetail the curriculum with a range of standards and outcomes we want our preschool Head Start to address. A cover page for each curriculum domain lists relevant program measures and Head Start Performance Standards. In addition, the cover page lists the related NAEYC Accreditation Criteria with examples, plus the agency outcomes developed by the Community Action Agency for children from birth to five. Ongoing staff development helps make our standards and outcomes meaningful at the classroom level.

Getting to Know the New Curriculum

After initial drafts of several domains were completed, a group of teachers reviewed the curriculum and met as a focus group. They explained what they believed should be included in a curriculum and how they thought the curriculum would be received. In response to their input, we added lists of field trip possibilities and other useful classroom resources.

The curriculum was presented to the teachers during a training session in August 2000. The entire group reviewed the curriculum introduction. Each domain was reviewed by a small group of teachers who summarized the major points and reported to the larger group. The teachers' response to Planned Play was positive. They appreciated the well-defined expectations and the examples of what a child should be able to do typically by the end of Head Start. They made the following comments about the curriculum—

  • "This is going to make planning easier and more organized."
  • "I like having a framework for linking my planning to my assessment goals."
  • "I wish something like this would have been available when I was new to the agency."

We believe that involving the teachers in developing the curriculum and basing it on the already familiar assessment system, Work Sampling, diminished resistance to trying something new.

The Policy Council was directly involved in reviewing and providing direction to the curriculum. Parent input included the development of both an anti-bias statement and a transition plan from Early Head Start to Head Start. The Policy Council approved the curriculum in August 2000.

Next Steps

The Planned Play curriculum is a living document. We want to add input from teachers, such as descriptions of long-term studies and activities they have used in their classrooms. We also want them to note expectations that seem too advanced or too easy for preschoolers. Future plans include writing a parent guide to accompany the curriculum and developing a birth-to-three curriculum so our program will have a continuous curriculum from infancy through preschool. We will also be looking at how the curriculum meshes with the Head Start Outcomes Framework.

The Early Head Start specialists are also piloting a new assessment tool, the Ounce of Prevention Scale. When it is adopted, we will link their assessment to curriculum activities, as we did for preschool Head Start. We hope to have this work completed by September 2001.

Margo Dichtelmiller is an Assistant Professor at Eastern Michigan University. T: 734-455-2059; E: mdichtel@on-line.emich.edu.

Mary Cunningham DeLuca is the Director for Children's Services at the Community Action Agency in Jackson, Michigan. T: 517-784-4800; E: mdeluca@caajlh.org.

Brenda Webster is an Education Specialist with Head Start in Jackson, Michigan. T: 517-784-4800; E: bwebster@caajh.org.


 

 

From Curriculum to Outcomes: One Program's Experience
An Interview with Mary Carr-Wilt

Guest Editor Judy David interviewed Mary Carr-Wilt for this article. Frances Jones-Baker, Children's Services Coordinator, also contributed information.

Mary Carr-Wilt is the Program Manager of the Seattle Public Schools Head Start Program. For over four years, she has been working with her management team to improve the quality of their program, which serves 454 children in 13 schools. In this interview, Mary describes the history of their efforts, decisions they made, and ways they involved teachers in changing their curriculum and assessment system. She is excited about the results she has seen in classroom practices and child outcomes.

Q: What prompted you to make changes in the program?
I was newly hired in 1996 with a strong background in early childhood education and family literacy. At the time, Head Start was anticipating newly revised Program Performance Standards. Within our school district, there were other changes as a result of an internal monitoring and evaluation of our Head Start services. As a program, we were taking a close look at all of our systems. Our management team took the new Head Start regulations very seriously-particularly those related to child outcomes. The State of Washington was also defining language and literacy outcomes for children birth to five, plus the school district was responding to state and national requirements for K-12 educational outcomes. The climate was one of change.

Q: Where did you begin?
It was clear to me that our long-term survival would depend on demonstrating that Head Start was the first important step in the educational experience for Seattle's low-income children. As we began to assess the quality of our program, we realized there was a disparity in our classroom practices. While all of our lead teachers had CDAs (1/3 of the teachers held an Associate or higher degree), there was not a consistent link between their education and the richness of their practice. Our program lacked a strong infrastructure for monitoring and developing teacher competency. We were not reflecting the latest research findings about early childhood education or about language and literacy performance in low-income children. We believed that given the emphasis on literacy outcomes at the federal, state, and district levels, the best place to begin would be to study the mandated outcomes and assess our own practice.

Six classroom teams volunteered to sit on a Literacy Task Force in May of 1998 and were charged with exploring literacy development and curriculum, language assessment, and prevailing practices in our Head Start classrooms. Consultants on literacy development met with the Task Force to present current research ideas. The new Head Start Program Performance Standards were reviewed, and the group concluded that some standards were being addressed in all classrooms but no class was addressing all standards. We experimented with formal and informal language assessment in a small sample of children. We found some discrepancy between teachers' perceptions of the children's language skills and their actual performance. Some teachers were surprised that the sample of children generally fell below average in vocabulary. The Leadership Team concluded that, taken together, this information indicated a need to enhance our teachers' knowledge base in language and literacy as well as to deepen our classroom practices. The fact that about 38 percent of our children come from families whose home language is other than English, and that we serve a high number of children with special education needs, also called for taking a special look at our practices.

Q: Did the language assessment lead to curriculum changes?
Yes, we realized that if we were going to be able to provide the highest quality program for all our children, we would need to develop a strong system for ensuring quality from classroom to classroom. That system would have to link child assessment and goal setting to specific curriculum strategies and have a strong teacher training plan to support implementation.

In my former position with the State Department of Education, I reviewed many educational curricula. I believe that the DLM Early Childhood Program offers a comprehensive literacy-based approach that addresses the needs and values of our program and the district. DLM has 20 monthly curriculum units. Each unit has a teacher's planning guide for activities, children's books, and an assessment tool to mark children's progress. Materials are adapted for use with children who speak Spanish or have disabilities.

The editor of the materials was invited to meet with our Leadership Team to discuss the origins and scope of the materials. The Team agreed that the materials would provide a comprehensive vehicle for us to address and assess Head Start requirements consistently across our classrooms. We asked the members of the Task Force to field test a DLM unit, Friends and Family, for a month and give us their impressions. The Leadership Team wanted to make sure that key players supported the program's adoption if we chose to move forward.

When the editor came back to guide the Task Force on planning for the test unit, we invited all staff, Region X representatives, our special education and mental health partners, and other local early childhood programs. At the end of the pilot month, the Task Force listed their impressions of the DLM program strengths, concerns they felt we would need to address if we adopted the program, and strategies for dealing with the concerns. The classroom teams were largely pleased with the materials: they were well organized, bright, and fun, had lots of choices, and were well received by the children. The group consensus was to go forward with the curriculum.

Q: What were the concerns and how did you address them?
The curriculum required a more detailed level of planning from teachers, including the identification of specific strategies linked to individual objectives. The materials called for regular and detailed review of child progress. Time management for planning and documentation was different. To provide some level of familiarity, several members of the Task Force adapted our current planning materials to use with the new program. We began the 1999-2000 school year with a three-day institute on the program, supporting staff to implement the first month's unit. There was a steep learning curve for some teachers and some were worried that having a selection of recommended activities would take away their creativity. Others were worried that they might be evaluated on their ability to use the materials. We reassured them that the curriculum was flexible and that they would have plenty of time to learn about it.

Frances, our Coordinator for Children's Services, met monthly with the teachers to discuss concerns and support them in the development of plans and materials for the units. There was some level of discomfort with the wide range of skill development (2 1/2 - 6 years) that the program addresses in its Developmental Outcomes Checklist. The staff went through this instrument framework and agreed upon the skills that they felt were appropriate for our four-year-olds when they leave Head Start. Staff adopted the final draft last spring and we are testing the outcomes this year.

Q: What assessment strategies are being used in your program?
We use the DIAL as a screening tool when children enter the program. DLM's curriculum units are month-long, and a unit checklist is completed at the end of each one. A child's progress is noted across all areas of development. So now, on a regular basis, we collect data to use for curriculum planning and individualization, and we have consistent data across all the classrooms. We have been using the DLM Developmental Outcomes Checklist twice a year to report on individual children's progress. We're working on meshing the monthly unit assessments, which are used internally in the classrooms, and the Developmental Checklist. Teachers also record observations each week on the six children who are the focus of that week's team meeting. By the end of the month, each child in the classroom has been discussed. We've set up a research design to see if our children are meeting the child outcomes on the DLM Developmental Checklist. Five children from each classroom will be assessed at three points during the school year. We will analyze the data by gender, home language, and special needs. We want to see if the indicators are correct and if children are progressing in every classroom.

Q: How are these initiatives in curriculum and outcomes linked to staff development?
We've revisited child development because we knew that if the teachers could talk about that, it would help them reflect on their own practice. So we've set up on-site tailor-made courses through a community college. The courses, Applied Child Development and Early Childhood Education Curriculum Planning, use our curriculum as a foundation for planning, discussion, and analysis. One third of our teachers don't have their AA degrees and they're getting credit; other staff members are taking the course to ensure that everyone is at the same theoretical level. We're also taking data from the research project and based on the children's outcomes, giving curriculum support to our teachers where it's needed.

Q: What have been some of the biggest challenges?
Change has been hard for all of us, particularly when we've been comfortable with our original way of doing business, and the change calls for developing new skills. There is always a lot of fear that we may not be able to meet the new requirements. And then there's the reality that in the learning stage, things take a lot longer to get done than when you are proficient. That learning curve period can be very frustrating.

Q: What changes have you seen in the program as a result?
Although we are at the beginning of this journey, early on we observed visible changes in such areas as richness of the classroom environments of our newer teachers and a larger variety of language experiences. Lesson plans across the program are much more specific. Activities are clearly linked to curriculum and child goals.

Q: Where does the program go next?
You could look at this process and think, "Oh, this is just about adopting a new curriculum," but that's only one element of a larger process. It's been more about looking at what needs to be in place to truly be able to ensure equitable high quality assessment and instruction, and child achievement across our program. The materials we've adopted are very important because the careful alignment between assessment and planning eliminates a lot of guesswork for staff. The DLM curriculum offers a framework for discussing child development, individual differences, and best teaching methods. That framework is the critical foundation for supporting staff and children in this challenge to ensure outcomes. We're working on comparing the DLM checklist with the Head Start Outcomes. We're considering whether an additional form of language assessment is needed to complement the checklist.

This year is about getting comfortable with the materials and process, reviewing our child development knowledge, and reflecting on how the two work together. Next year will need to be about monitoring and coaching for consistency. The following year, with a solid foundation of staff confidence and consistent practice, we'll be able to focus on the critical questions about the impact of our practice on language, literacy, and school readiness.

In the meantime, we're beginning the conversations within our district about how best to align our assessment and curriculum strategies with those of the K-12 system, so that it all clearly fits together for families.

Q: What advice would you give other programs grappling with outcomes and assessment issues?
Take this opportunity to find support in your community from people whose business is curriculum and assessment. We've used researchers and consultants from the beginning to inspire us because we don't have time to design it all ourselves. The regulations really require Head Start Directors to know child development to make informed decisions. Several of us got together and set up a series of four seminars for fellow directors; we presented information on curriculum, phonological awareness, and other topics. There is a lot of pressure on the federal and regional offices to enhance an infrastructure for training and support on curriculum, assessment, and staff development.

Mary Carr-Wilt and Frances Jones Baker can be reached at T:206-252-0960; E: mcarr@seattleschools.org and fjonesbaker@seattleschools.org.


Using the High/Scope Preschool Child Observation Record (COR)
By Larry Schweinhart and Eileen Storer

The High/Scope Preschool Child Observation Record (COR) is a tool for assessing the development of children two to six years old. The COR meshes quite well with the Head Start Performance Standards and the new Head Start Child Outcomes Framework. It is developmentally appropriate and widely used in Head Start programs. Head Start teachers who complete it several times a year can assess how well their program contributes to children's development. Although the COR was originally developed for use with the High/Scope curriculum framework, a Head Start grant ten years ago enabled High/Scope to further develop and validate it for use in any early childhood program, whatever curriculum it uses. The COR manual presents evidence of its reliability (its ability to measure consistently over time and across observers) and its concurrent validity (its agreement with other established instruments measuring the same things).

To use the COR, teachers begin by observing children and writing notes that describe their developmentally significant behavior. These notes provide the evidence needed to complete 30 items in six areas of development: language and literacy, logic and mathematics, initiative, social relations, creative representation, and music and movement. These areas closely resemble the domains described in the Head Start Child Outcomes Framework. Each item has five specifically described levels, giving the assessment tool a developmental perspective that a simple checklist does not have. The levels range from easiest to hardest and are developmentally appropriate for preschool children two to five years of age. Here, for example, are the five levels for the item on demonstrating knowledge about books—

  1. Child does not yet pick up books and hold them conventionally.
  2. Child picks up books and holds them conventionally, looking at the pages and turning them.
  3. Child picture-reads, telling the story from the pictures on the cover or in the book.
  4. Child follows the print on a page, moving his or her eyes in the correct direction (usually left to right and top to bottom).
  5. Child appears to read or actually reads a book, pointing to the words and telling the story.
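For readers who keep COR results electronically, the brief sketch below (Python) shows one way an item and its five levels might be represented, using the book-knowledge item above as the example. The data structure, the child's name, and the record_observation helper are illustrative assumptions, not part of High/Scope's own COR software.

    # Illustrative sketch: one COR item with its five levels, and a record that
    # links an anecdotal note to the level it supports.
    book_knowledge_levels = {
        1: "Does not yet pick up books and hold them conventionally",
        2: "Picks up books and holds them conventionally, looking at and turning the pages",
        3: "Picture-reads, telling the story from the pictures",
        4: "Follows the print on a page, moving eyes in the correct direction",
        5: "Appears to read or actually reads, pointing to the words and telling the story",
    }

    def record_observation(child, anecdote, level, levels=book_knowledge_levels):
        """Attach an anecdotal note to the COR level it best supports."""
        if level not in levels:
            raise ValueError("Each COR item has five levels, 1 through 5")
        return {"child": child, "anecdote": anecdote, "level": level,
                "descriptor": levels[level]}

    # "Maria" and the anecdote are hypothetical examples.
    print(record_observation("Maria", "Told the story from the pictures on the cover", 3))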

High/Scope recommends that teachers participate in a two-day workshop on how to use the COR and offers these workshops throughout the country. The workshops provide training in how to recognize developmentally significant behavior and describe it in anecdotal notes, how to select the item and item level that each anecdotal note represents, and how to report these results to parents and program officials.

High/Scope is now planning and preparing to expand the COR to both younger and older children. We have developed the High/Scope COR to be used with children from six weeks to three years (overlapping with the preschool version, which begins at age two). In the past few years, we have been working with elementary educators to develop a version of the COR for ages five to seven. We are currently conducting studies of the reliability and validity of each instrument and expect them to be available in 2001.

To consider the COR from the perspective of a director of a local program, we asked Ann Brown, director of the Michigan School Readiness program at the Learning Village, Inc., in Kalamazoo, Michigan, to answer a few questions on her program's use of the COR. Here are our questions and her responses.

Why did your program choose to use the COR for the assessment of young children?
The COR allows us to base our program on what we know about individual children. Teachers might think they already know their children, but the COR goes beyond that to organize our observations and understanding of children. It helps us see groups of children at different times of the day and determine whether certain times need to be planned more carefully. In addition, the COR facilitates our communication with parents. So many parents say, "All the kids do in this program is play. The teachers don't teach them anything." But they need to see the true picture. One of the challenges for programs like Head Start is to communicate to parents what their children are learning. The COR is a deliberate and focused way to communicate with parents about what their children are learning in Head Start.

Using the COR also means improving our interactions with children. Lately, we have been focusing on how well our teacher practices support children's initiative. For example, one day on the playground, a child wanted to walk up the stainless-steel slide. He was holding onto the sides and no other kids were around. His feet were slipping a little, but he was doing it. Referring to the COR items for initiative helped us decide it was okay for the child to do this as long as he was safe.

What do you think of the criticism that the COR is time-consuming?
Teachers who are not used to systematic observation of children's development do have to adjust to the added workload of the COR, but the effort pays off handsomely in their greater knowledge of their children's development, their ability to teach children developmentally, and their ability to communicate with parents about how their children are developing. It takes time to learn how to implement the COR well; it cannot happen overnight. It has taken our staff three years to really feel that we have put all of the parts of the COR together. The curriculum model was in place, the staff were trained, and then we began attending to the COR, doing observations and putting them together. The first year, we only did the COR once on each child. Another year, we went from using the manual to the computerized system, which involved some learning. All of the steps took time, but I knew we were doing better than most other programs in the assessment system we were putting into place.

Do you think it is appropriate to use the COR as a screening tool?
No. We use it to develop program plans, and to get information about how to support children's development. We begin by writing anecdotes, which helps us identify issues to address. For example, we can learn about a child's language skills at meals over a period of time. The period of time is required to distinguish a true language issue from a child's lack of comfort in a new program. Observational evidence that is consistent over a few weeks is important to have before making a referral for a formal evaluation.

How does the COR help you with child outcomes?
One of my focuses this year has been to share with staff the outcomes for children by presenting them with pre- and post-program COR comparison data. The discussion helps staff focus on areas of child development and answer the questions: What do we need to know more about? What goals do we want to develop? We use the COR to assess child outcomes for reporting to the government. We are systematically assessing children using an instrument that I trust because it has proven reliability and validity. It's not just a checklist or a homemade assessment tool. The COR really focuses on staff responsiveness to what they do every day. It's feedback from the kids. Information from the COR is a continual topic of conversation.

Overall, the COR accomplishes three things—

  • You see children's actions in the context of the classroom and the home.
  • You see children not only alone, but also in relationship to their peers and adults.
  • The different parts of the COR inform you about all areas of a child's development.

Head Start's new requirement to assess child outcomes has the potential to radically transform the program. An excellent way to ensure that the transformation will enhance children's educational experiences is to use an observational assessment tool of established reliability and validity, and to ensure that it is both developmentally appropriate to children and user-friendly. The COR is such a tool.

Larry Schweinhart is the Research Division Chair and Eileen Storer is a Research Associate at the High/Scope Educational Research Foundation. T: 734-485-2000, E: LarryS@highscope.org, and EileenS@highscope.org.


How Screening and Assessment Practices Support Quality Disabilities Services in Head Start
By Jim O'Brien

The Head Start Performance Standards do not require that any particular strategy, instrument or technique be used. Appropriate procedures, however, should conform to sound early childhood practice and be valid, measuring what they are supposed to measure, and reliable, yielding consistent results over time and across users. Agencies should consult with the program's content area experts in health, child development, and mental health, with parents, and with the Health Services Advisory Committee as they design and implement a developmental screening approach.

Guidance related to 45 CFR 1304.20(b)(1-3)

Head Start works with families and community partners to enable the early detection of obstacles to children's development and then intervenes to reduce or eliminate these barriers. For many children, enrollment in Head Start provides the first indication that a disability or health condition may be affecting their development. To promote developmental and learning outcomes for all children, Head Start programs must plan and implement a sound, systematic approach for developmental screening and ongoing assessment. This screening and assessment system must include the careful selection and administration of instruments and procedures and the competent interpretation of results. The system must be understood and used by the program and parents as a means to support developmental and learning outcomes for all children.

The Performance Standards require that, within 45 days of a child entering Head Start, appropriate screening procedures be completed to identify any developmental, sensory (visual and auditory), and behavioral concerns. These procedures should be appropriate for the child's age, cultural background, and language and be conducted in collaboration with parents. The Performance Standards also require that, when appropriate standardized developmental screening instruments exist, they be used, and that consultants to the program be involved in helping programs select procedures. Sound screening instruments are designed to have the sensitivity to identify children who need further assessment and the specificity to exclude those who do not.
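The sketch below (Python) illustrates what sensitivity and specificity mean in practice once screening results are compared with the results of formal evaluations. The counts are invented for illustration; they are not drawn from any Head Start data.

    # Illustrative sketch: computing sensitivity and specificity from invented counts.
    true_positives = 18   # flagged by screening, disability confirmed by formal evaluation
    false_negatives = 2   # missed by screening, disability confirmed by formal evaluation
    true_negatives = 170  # not flagged, and no disability found
    false_positives = 10  # flagged, but no disability found

    sensitivity = true_positives / (true_positives + false_negatives)   # 18/20 = 0.90
    specificity = true_negatives / (true_negatives + false_positives)   # 170/180 ≈ 0.94

    print(f"Sensitivity: {sensitivity:.2f}  Specificity: {specificity:.2f}")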

For some children, the results of screening procedures, combined with information available from the ongoing assessment of progress required for every child (1304.21), may indicate the need for referral for a formal evaluation by a professional. As the Performance Standards (1308.6(b)) note, even standardized developmental screening is insufficient to determine disability; screening merely identifies those children who require a referral. The formal evaluation, utilizing multiple sources of information from the family and program (including the ongoing developmental assessment of the child), will more fully assess the child's status and determine what intervention may be needed (e.g., special education or related services). A successful screening and assessment system requires appropriate instruments and procedures. When staff and parents are well informed and supported to understand and act upon the information the system provides, it will positively affect child outcomes. When staff and parents commit time and energy to necessary procedures (e.g., documentation of parent permission, timelines, scores), they expect the program to act upon the information in a timely and systematic manner to address identified needs and concerns. A process that responds to the child's needs can reinforce the parents' expectation that future screening and assessment will be a meaningful activity for them and their child.

The remainder of this article describes approaches to involving all the stakeholders in the screening and assessment system and ways of increasing their understanding and effective participation.

Engaging the Meaningful Participation of Staff and Parents

The planning and implementation of a screening and assessment system requires coordination and communication within the program as well as with community partners. The Disabilities Services Coordinator has the responsibility to provide leadership to the Head Start team and others so that their activities lead to effective parent involvement and developmental services for children with disabilities (see below). But in addition to these activities, the involvement of the people with the greatest day-to-day influence on the child's development, the parents and the teachers, is critical.

Some Important Activities for Disabilities Services

  • Work with the program team to ensure that parents are informed of the screening's purpose, procedures and results, and kept informed throughout any formal evaluation that may be required.
  • Arrange for a formal evaluation of children who have been identified as possibly having a disability. Make a referral to the Local Education Agency/Child Find/Part C Agency as soon as the need is evident.
  • Assist Head Start parents and program staff to take an informed and active role in decision meetings required by the Individuals with Disabilities Education Act (IDEA).
  • Coordinate with managers and staff implementing health services, ongoing developmental assessments, and family partnerships to assure that the full range of information available is used continuously to inform appropriate program planning for children with disabilities.

Guidelines for Screening and Assessment

  1. Screening and assessment should be viewed as services-as part of the intervention process-and not only as a means of identification and measurement.
  2. Processes, procedures, and instruments intended for screening and assessment should only be used for their specified purposes.
  3. Multiple sources of information should be included in screening and assessment processes.
  4. Developmental screening should take place on a periodic basis. It is inappropriate to screen young children only once during their early years. Similarly, provisions should be made for reevaluation or reassessment after services have been initiated.
  5. Developmental screening should be viewed as only one path to more in-depth assessment. Failure to qualify for services based on a single source of screening information should not become a barrier to further evaluation for intervention services if other risk factors (e.g., environmental, medical, familial) are present.
  6. Screening and assessment procedures should be reliable and valid (i.e., consistent in their ability to measure what they are intended to measure).
  7. Family members should be an integral part of the screening and assessment process. Information provided by family members is critically important for determining whether or not to initiate more in-depth assessment and for designing appropriate intervention strategies. Parents should be given complete informed consent at all stages of the screening and assessment process.
  8. During screening and assessment of developmental strengths and problems, the more relevant and familiar the tasks and setting are to the child and the child's family, the more likely it is that the results will be valid.
  9. All tests, procedures, and processes intended for screening or assessment must be culturally sensitive.
  10. Extensive and comprehensive training is needed by those who screen and assess very young children.

Meisels, S.J. & S. Provence. 1989. Screening and Assessment: Guidelines for Identifying Young Disabled and Developmentally Vulnerable Children and Their Families. Washington, DC: National Center for Clinical Infant Programs, p. 24.

Common Pitfalls in Screening

  • Scheduling a screening when the problem is already observable. When trained staff report an obvious problem, a referral for a formal evaluation may be the appropriate first step.
  • Ignoring screening results. Sometimes, initial screening test results are not taken seriously and a "wait and see" attitude is adopted. Good screening instruments are usually right, and there is risk of harm from delayed diagnosis and intervention.
  • Relying on informal methods. Informal tools such as checklists often miss problems. Validated and standardized tools carry the burden of proof that informal measures lack. We would never select tools for blood lead or other medical screens with questionable or unknown levels of accuracy. Why do this with development?

Adapted from F.P. Glascoe and H.L. Shapiro, "Developmental and Behavioral Screening," 1999.

Communication

Head Start staff may have limited experience in discussing the results of screening and ongoing assessment with families. When the evidence suggests that there may be a developmental concern requiring more formal evaluation, some staff may be reluctant to present this information to parents. For some Head Start families, this may be the first time that a developmental concern has come to their attention, others may have had concerns but were reluctant to discuss them, and still others may have been trying to get information about their concerns for some time. Whatever the situation, the quality of the communication between staff and families will have an impact on a family's willingness to consider and act upon the screening and assessment findings. Training and supervision must support this important function.

Programs need a well-planned system for communicating the screening and assessment results to parents. When communication is not planned and purposeful, parents of young children with disabilities often relate a common story of suspecting a problem but being reassured that the child will "grow out of it." Parents are more likely to accept information when they believe that they have good communication with the person doing the screening. Head Start, in its ongoing partnerships with families, has an opportunity to communicate screening and assessment results to parents in a manner that recognizes the child's strengths while systematically responding when a concern warrants it.

Staff members also need opportunities to explore and discuss with supervisors any reservations, questions, and concerns about making a referral.

There are hidden costs to discounting screening findings: missed opportunities for early intervention may complicate a problem. This issue is often present in screening for emotional and behavioral concerns that carry a stigma. If not addressed, the child's behavioral difficulties often produce rejection by peers. Managers and consultants need to solicit feedback from staff and parents on whether the screening and assessment are helping them support children's development. If staff perceive the procedure as having consequences (e.g., a stigmatizing "label") without resulting in useful guidance about how to address the behavioral concern, then they are less likely to endorse the system.

Throughout the year, programs should provide opportunities for feedback from staff and parents on the screening and assessment system. Inquire about what is useful, confusing, or perhaps being rejected. Provide feedback on what the system has accomplished. Acknowledge that any screening process will "detect" some concerns that, upon further evaluation, do not warrant intervention; it will also fail to detect some problems that do require intervention. No screening instrument is perfect but each is tested and retested to be better than unstructured observations and impressions. To the extent that families and staff see that the system takes into account their feedback, respects their knowledge of their children's development, and makes a difference for children, they will be more supportive of screening and assessment.

The Screening and Assessment Process

Use Information Continuously

Screening and assessment systems should contribute to the Head Start program's ongoing efforts to help children reach developmental and learning outcomes. When implemented well, a system provides specific and timely information to inform teachers and parents about each child's progress and can support the individualization needed to address developmental outcomes for each child. For children with disabilities, the results of formal assessments and the objectives from individualized plans for special education and instructional supports provide guidance that must inform their daily experiences.

Remember that while ongoing assessment is, by definition, expected to occur throughout the program year, screening is most often associated with the child's entry into the program. Given the rapid growth and changes that young children display, screening should occur on a periodic schedule consistent with the Early and Periodic Screening, Diagnosis and Treatment (EPSDT) program recommendations. For most children, the screening and assessment system offers reassurance that the child is on track for achieving the expected developmental outcomes. Furthermore, staff and parents should be provided direction and support to remain vigilant and responsive to any concern that emerges after the initial screening period. Sound procedures have decision rules on when to conduct a rescreening or additional assessment.

Empower Parents of Children with Disabilities

Parents of children with disabilities can benefit from Head Start experiences that help them practice communication, advocacy, and decision-making skills using screening and assessment results for their children. Parents of school-age children with disabilities often describe their early experiences with assessment reports and individualized planning as confusing and intimidating. Head Start can empower families with expectations they will carry with them into their child's school career: that assessment procedures and results must be explained to generate informed decisions, that parent concerns must be addressed, and that resources, including other parents, must be identified to provide support and guidance.

Parents of children with disabilities will need orientation to key concepts from the Individuals with Disabilities Education Act (IDEA), such as parental consent, evaluations, confidentiality of records, eligibility for special education and related services, services in the least restrictive environment, and rights to due process. Opportunities in Head Start for parent education on their rights and on the school's obligations under IDEA will help them develop a sound foundation for their child's school experience. The Disability Services Coordinator should play an important role in supporting parents' goals in these areas.

Managing Child Records in the Screening and Assessment System

Key features include—

  1. A record of the procedures used for screening and assessment
  2. Evidence that the child's family provided information in the screening process
  3. A report on the results of screening, including steps taken if further assessment was indicated (such as obtaining written permission from the parents)
  4. Evidence that screening and any follow-up assessments and actions were completed in a timely manner
  5. Evidence that information is handled in accordance with the program's confidentiality requirements and is readily available for ongoing use by staff members who need to act on this information.

Ensuring Outcomes for All Children

As Head Start programs focus attention on how their practices contribute to developmental and learning outcomes, it is important to recognize that the 1997 reauthorization of the Individuals with Disabilities Education Act (IDEA) specified that children with disabilities should be included in state and local efforts to measure educational outcomes for children. Advocates for children with disabilities make a convincing argument that excluding children with disabilities from measures of child outcomes weakens a school's expectation that all children will make significant progress toward the desired outcomes, and its accountability for ensuring that they do.

Similarly, in Head Start, an approach that documents success only for children without disabilities is not sufficient. All Head Start children, including those with disabilities, should receive ongoing assessment linked to the Child Outcomes Framework, curriculum planning, and communication with parents. For most children with disabilities, the key to achieving positive outcomes will be the provision of needed supports. When the progress of children with disabilities is included in a program's self-assessment, the program can develop information on best practices that support the achievement of outcomes for children with a variety of strengths and needs.

Jim O'Brien is a Program Specialist in the Health and Disabilities Services Branch of the Head Start Bureau. T: (202) 205-8646, E: jobrien@acf.dhhs.gov.


Family-Centered Assessment
By Leah Shatavsky Bratton

Our Early Head Start program at Early Education Services (EES) in Windham County, Vermont, serves 107 infants and toddlers. We work hard to ensure the participation of families in the assessment of their children. From the outset of our program, several beliefs have guided and informed our assessment process, including-

  • Parents know more than anyone else about their children and can provide meaningful and reliable information. Families' observations, ideas, and concerns must be central to planning and performing assessments and screenings.
  • Parents benefit from taking part in evaluations of their children. During the screenings or assessments, when parents focus on their child and get support and information, they increase their understanding of their child's development, strengths, and needs.
  • Parents should choose how they will participate in the assessment process. The more actively involved they are, the better experience it will be for them, their child, and the home visitor.
  • Including families in the process, in ways that they want to participate, sends the message that they are an important part of the assessment and more importantly, of their child's life. If we exclude parents from the process, we risk not only losing important information about the child, but parents may not fully engage in goal setting, planning, or the program itself-resulting in fewer benefits for the child and the family.

Keeping these ideas in mind, we do everything we can to include parents and to make screening and assessment interesting, fun, and worthwhile.

Parents: Partners in Assessment

When a child is enrolled in the program, the home visitor's first task is to get to know the family and the child. The goal is to establish a mutually trusting and respectful relationship, which is valuable because it enables parents and home visitors to work together to support the child's development and the parent-child relationship. When parents are active partners in the assessment (and intervention) process, home visitors and families share an understanding or belief in what is best for the child, the parents' priorities are acknowledged, and parents and professionals work toward shared goals.

In our program, observation and conversation are at the heart of assessment. We provide ongoing training and supervision to our home visitors in observing and recording infant and toddler behavior. We also use training materials and an observation guide that focus on coaching parent-child interactions. Home visitors become skilled at observing the subtle aspects of interaction that indicate the quality of the parent-child relationship and the ways parents and their young children communicate.

At the beginning of a home visit, the professional might ask a question like: "What kinds of things has Janie been doing since my last visit?" Noticing that 16-month-old Steven reaches for his bottle on the table and repeats "ba-ba", the home visitor may say, "What are some of the ways he lets you know what he wants?" His mother might begin to describe his emerging language and how talkative he is. These questions are open-ended, non-threatening and give parents a chance to say what they want. There is no right or wrong; the parent is the expert. The home visitor is an active listener and is skilled at engaging parents in an easy conversation about their child.

When home visitors describe their observations of the child's development, parents are delighted. It shows that the professional knows the child and appreciates his or her growth. These observations also validate the parents' competence as parents and their important role in their child's development. "I watched how Rosa turns her head when she hears your voice in the other room. She wants to know where you are!" Such an observation can also be an opportunity for the home visitor to talk about the social and emotional development of infants. In this way, parenting education is a natural outcome of ongoing assessment.

When parents and Early Head Start staff share their thoughts and observations of the child, it leads to planning and thinking about what goals to set and which activities will enhance the child's development. In one family, the toddler was taking great joy in his mobility-he was on the move all the time. When the home visitor saw how much the child wanted to practice his newfound skills, she and the family discussed how to encourage him, even though it was the middle of a very cold winter in Vermont when outdoor activity was limited. She suggested several indoor places in the community where he could walk and run. As trust between the families and the home visitors builds, the sharing of information and observations increases between them.

Using the ASQ Questionnaire

To supplement ongoing assessment based on observing the child and talking with the family, EES uses the Ages and Stages Questionnaire (ASQ), which is designed for child monitoring. The items represent milestones in five key developmental areas: communication, gross motor, fine motor, problem solving, and personal-social. There are six items in each category, and each is checked as yes, sometimes, or not yet. Sample items at eight months include-

  • Does your baby make sounds like "da," "ga", "ka," and "ba?" (communication);
  • Does your baby pick up small toys with only one hand? (fine motor);
  • Does your baby feed himself a cracker or cookie? (personal-social).

At the end of the form, there are a few Yes/No questions appropriate for the child's age. For example, the ASQ used at eight months asks these final questions:

  • Do you think your child hears well?
  • Uses both hands equally well?
  • When you help your baby stand, are her feet flat on the surface most of the time?

Parents are asked to explain any "no" answers. These last items can be used for screening purposes to indicate areas of concern.

Starting when the infant is four months old, the ASQ is used at designated intervals. We chose Ages and Stages because it is simple to use and has proven to be reliable and valid. The items are easy to understand and are illustrated. The manual offers guidelines for determining whether children are at high or low risk in the various developmental areas. Concerns identified when completing the questionnaire are usually not a surprise to parents and home visitors; more often, they reflect questions and observations already on their minds.
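For programs that tally ASQ responses electronically, the sketch below (Python) shows the general idea of summing an area's responses and comparing the total to a cutoff. The point values and cutoff shown are illustrative assumptions only; programs should follow the scoring guidelines in the ASQ manual itself.

    # Illustrative sketch: tallying the six responses in one ASQ developmental area.
    POINTS = {"yes": 10, "sometimes": 5, "not yet": 0}  # hypothetical point values

    def area_score(responses, cutoff):
        """Sum an area's responses and flag totals that fall below the cutoff."""
        total = sum(POINTS[r] for r in responses)
        return total, ("discuss further assessment" if total < cutoff else "developing on schedule")

    communication = ["yes", "yes", "sometimes", "yes", "not yet", "yes"]
    print(area_score(communication, cutoff=30))  # hypothetical cutoff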

Administering the ASQ is relatively simple and straightforward. Although it was designed for parents to use on their own, home visitors and families complete it together. It does not take long (maybe half an hour), but the most important part is that it is an opportunity to observe and talk together. Whenever possible, natural observations are used as the basis for filling out the ASQ. If one of the items pertains to the child's use of the pincer grasp (thumb and forefinger), the child might be observed eating Cheerios during lunch, picking them up one at a time. Sometimes our home visitors suggest ways to elicit the child's behavior, like inviting a parent to clap her hands and watching how the baby imitates her.

Data from the ASQ are discussed at a meeting between the supervisor and the home visitor. At this time, the child's accomplishments and any areas of concern are highlighted and next steps can be planned. EHS has a large resource library with many curriculum books that staff can use to plan developmentally appropriate experiences for individual children.

Information from the ASQ is sometimes shared at the agency's case management meeting-referred to as "Family Update in a Nutshell," or F.U.N. At this meeting, held quarterly for each family, the group of EHS professionals discusses the family. The home visitor briefly updates the group and discusses relevant issues about the child and family. If necessary, interventions are planned or a referral is made.

Scales like the ASQ provide useful information about a child's skills, yet they can also cause undue anxiety for some parents when a child has not yet reached a milestone. Our home visitors understand child development and can reassure families that children develop at different rates. They also provide insight into the growth of the individual child by explaining that sometimes a spurt in one area of development means a plateau or even backsliding in another. A home visitor might explain: "The reason Kayla is not talking much is that most of her energy is going into learning to walk. When she masters that new skill, her verbal development will probably take off."

Using the Ounce Scale

To better link assessment, planning, and intervention, EES is piloting a new infant and toddler assessment measure, The Ounce of Prevention Scale, still in draft form. It not only provides information about what the child is doing but also helps parents and providers understand how children use those skills and abilities, and how the environment and parent-child relationship can support children. It focuses on everyday, naturally occurring, practical behaviors and accomplishments that are easily recognized by parents and others. The child's developing social competency and adaptive capacity are highlighted in the Ounce materials.

The Ounce incorporates multiple strategies of assessment, including an observational record and accompanying guidelines with questions and examples of children's behavior at different ages. A family album enables family members to become actively involved in making observations about the child's development and offers suggestions about ways to enhance the child's development and strengthen their relationship. Finally, a summary record assesses the child's mastery across different areas of functioning.

We may find that the Ounce Scale complements the ASQ and use both of them. It is too soon to tell. We know that any assessment in our program must involve parents and strengthen their understanding and appreciation of their children's unique characteristics and progress over time. Assessment must also help home visitors understand and adapt to the strengths of each family and respond to their priorities and concerns. When families and professionals are partners in assessment, everyone benefits.

Leah Bratton is the Early Childhood Coordinator for the home visiting program of Early Education Services. T: 802-254-3742; E: lbratton@sover.net

I would like to thank Mary Moran, Director of EES, and Dot Marsden, co-developer of the Ounce Scale, for their help with this article.


Environmental Evaluations: The Key to Quality in Early Head Start Classrooms
By Martha Buell

Recent advances in the study of brain development clearly indicate that early experiences dramatically shape the brain's structure and function. In particular, research indicates that the caregiving setting of young children (including the adult-child interactions, daily routines, and equipment) affects their development in profound and long-lasting ways. In order to ensure that the caregiving environment provides young children with the best possible start, it is critical to evaluate their environment.

In our program, Northern Delaware Early Head Start (NDEHS), we take environmental assessment seriously. NDEHS serves 107 infants and toddlers throughout New Castle County, Delaware. Children and families receive services from NDEHS in three ways: a traditional center-based program (i.e., all of the services are provided through a center funded by Early Head Start), a home-based program (i.e., children receive services through home visits), and a Childcare Partnership model (i.e., visiting Early Head Start developmental services are provided to local family child care or center-based programs combined with monthly home visits).

A child is served through one option at any given time. However, the program has built-in flexibility to allow children to transition from home-based services to center-based services as slots are available and as family needs change. What adds further complexity to our program is that over 50 percent of our services are offered through subcontracts with a Head Start program that offers home-based and child care partnerships services and with two other community agencies that deliver center-based services. Given the range and diversity of the NDEHS program's options, environmental assessments are key to ensuring consistently high quality services.

Assessing the Environment

The Head Start Performance Standards and the PRISM monitoring tool are used to ensure program quality. They describe the features that must be included in a program. However, since they do not explain in detail how to design and run a classroom or family day care that meets the developmental needs of infants and toddlers, NDEHS uses other environmental measures as guides to ensure program quality. These measures include the Family Day Care Rating Scale (FDCRS), the Infant and Toddler Environmental Rating Scale (ITERS) for center care, and the Health and Safety Checklist for all out-of-home care. These instruments are based on sound principles that support optimal development for young children.

NDEHS uses these measures to validate effective practices and offer guidance on ways to improve quality. All out-of-home child care placements are assessed at least once a year using the appropriate instrument. The ITERS and FDCRS use a seven-point rating scale: a rating of 1 indicates unacceptable or harmful, 3 indicates minimal quality, 5 shows good quality, and 7 represents excellent quality. Depending on the significance of the particular item and its score, we are able to prioritize needs and designate necessary resources, such as materials or staff training, for the site. We ensure that all health and safety items on the ITERS or FDCRS are scored at a level 7 before we work to improve the scores of other items.
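The short sketch below (Python) illustrates the prioritization rule described above: health and safety items are brought to a rating of 7 before other lower-scoring items are addressed. The item names, scores, and the below-good-quality threshold used for the other items are illustrative assumptions, not part of the ITERS or FDCRS.

    # Illustrative sketch: ordering follow-up from environmental rating results.
    ratings = {
        "Health practices":   {"score": 6, "health_safety": True},
        "Books and pictures": {"score": 4, "health_safety": False},
        "Peer interaction":   {"score": 5, "health_safety": False},
    }

    def improvement_priorities(ratings):
        """List items to address: health and safety shortfalls first, then items below good quality."""
        safety_first = [name for name, r in ratings.items()
                        if r["health_safety"] and r["score"] < 7]
        others = sorted((name for name, r in ratings.items()
                         if not r["health_safety"] and r["score"] < 5),
                        key=lambda name: ratings[name]["score"])
        return safety_first + others

    print(improvement_priorities(ratings))  # ['Health practices', 'Books and pictures']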

We incorporate the instruments into training and use the evaluation of the environment as an important vehicle for staff development. In this way, providers receive information about what is expected in a developmentally appropriate program. These measures also provide information about appropriate materials and how to use them with infants and toddlers.

As staff members learn about environmental assessment, we identify training areas needing more coverage and plan future training accordingly. The staff knows well in advance that environmental settings will be evaluated and what criteria will be used. In fact, the criteria themselves are useful teaching tools.

It is important to help providers understand how the environmental assessments are tied to the outcomes we want to achieve with children in our Early Head Start program. This becomes critically important for children with special needs because we place them in natural environments (i.e., in the same program or setting the family would choose if the child did not have a disability).

Because of our commitment to supporting early intervention in natural environments, we work to maximize the extent to which IFSP (Individualized Family Service Plan) goals are supported in the Early Head Start classroom, family day care, and the child's home and community. Our environmental evaluations are a useful means of identifying further modifications, materials, or arrangements that support inclusion.

Using an ITERS item as an example, the descriptor of a program getting a rating of 7 (excellent) on Number 16 (Books and Pictures) states, "Each infant/toddler given opportunity daily for at least one language activity using books, pictures or puppets. Cozy book area set up for toddlers to use independently." This item requires us to reflect on how we will modify a language activity with puppets for a child with a visual or hearing disability. It may also challenge us to assess how we can make a book area accessible to a child with limited mobility. The evaluations are used to guide our program to meet all the needs of all the children.

Challenges to Evaluation

NDEHS has offered services for only two full years, and we expect that many challenges to effective environmental assessment will be overcome as our program matures. However, one of the principal challenges remains the amount of time (at least three hours) it takes to complete each environmental assessment.

Currently, the assessments are conducted by the Early Care and Education Coordinator with help from graduate students from the University of Delaware. Our goal is to establish a system of teachers and providers who conduct their own assessments under the supervision of the Coordinator.

Another challenge is changing negative staff attitudes about assessment. Some staff members believe the process will focus on what is wrong with their approach or program and result in faultfinding and blaming. We try to reframe assessment as a tool to confirm best practice and a way to target resources that will make the providers' jobs easier. Enabling the providers and teachers to conduct their own assessments will help them accept the process.

Long Range Plans

Through newsletters, parent committees, and policy council meetings, family members are kept informed of the improvements that result from our environmental assessments and the ways in which these additional resources support their child's development. In the future, we would like to train family members to participate in these assessments, as we do with the child screenings.

In our program, parents and guardians have been trained to use the child screening tool, Ages and Stages, during the child's early years. They are responsive to being involved in this process and find it both educational and empowering. They gain valuable information about milestones in their child's development, acquire tools for child observation, and learn ideas for age-appropriate activities. Involving parents or guardians in environmental assessments would teach them about critical features of their child's learning environment and build rapport between the family and caregivers.

Another long-range plan is to develop or locate a measure to assess the child's home environment. Presently we conduct a health and safety check of each family's home. We have reviewed the H.O.M.E. instrument and concluded that it does not entirely address our needs. We have not yet found instruments comparable to ITERS and FDCRS that offer guidance for improvement and confirm current practices.

Conclusions

Environmental assessments help ensure a quality program for infants and toddlers. But the job of evaluation does not end there. These measures are only useful in conjunction with assessments of program effectiveness for individual children. When combined with individualized assessments, such as developmental portfolios, environmental assessments contribute to ensuring quality services for all children.

Martha Buell is an Assistant Professor in the Department of Individual and Family Studies and the Director of Northern Delaware Early Head Start. T: 302-831-6032; E: mjbuell@udel.edu.


Using Assessment to Help Us Work with Families
By Stephanie Hudson and Mary Sinur

Assessment is used not only to monitor children's progress relative to Head Start Child Outcomes, but also to improve program services. The following article describes, from the perspective of the Coordinator of Early Childhood Education and a Family Services Advocate, how an Early Head Start program works with a staff development assessment tool.

A Coordinator's Story

Providing ongoing support and technical assistance to home visitors in Early Head Start programs takes a lot of knowledge, insight, patience, and trust. As the Coordinator of Early Childhood Education (ECE) and Disability Services for Project Eagle/University of Kansas Medical Center in Kansas City, one of my primary job responsibilities is to provide our Family Service Advocates (FSAs) with technical assistance in the areas of child development, parent-child interactions, early education, and early intervention. In addition to helping them achieve certification in the Parents as Teachers curriculum and the Denver II screening instrument, I conduct four to six training sessions a year. I work with 11 staff members and am responsible for assessing their ongoing work with families and helping them build their skills.

One way I support the Family Service Advocates is to work closely with them on all aspects of their home visits with families. I start by helping to identify major concerns, issues, and goals prior to a family visit, and we set aside time to debrief after each visit. I am required to make two home visits with each FSA every six months, but more often an FSA invites me along, and I end up making closer to 25 home visits per quarter. When we make a visit together, I look for evidence relevant to our earlier discussions and evaluate the strengths and areas needing improvement. If appropriate during the home visit, I may take advantage of a "teachable moment" and model an intervention strategy.

Our program has developed the Continuous Quality Improvement Checklist (see the CQI checklist at the end of this article) to help me assess interactions during the family visit. Sixteen items focus on the Advocate's behavior, and each one is marked as needs improvement, adequate, or highly successful. The tool emphasizes instructive and positive feedback to families, and when I am observing the Family Service Advocates, I model the same style of feedback to them. At the end of the form, there is space for additional comments or observations. After the home visit, the FSA and I meet to discuss the CQI checklist and my overall observations and interpretations. The FSA, in turn, discusses what he thought was happening during the visit and how he felt about it. We brainstorm ways to improve the quality of the visits and consider his next steps. We both sign off at the end of the form.

This assessment process is a strengths-based approach. I look at the FSA's abilities and learning style. Together, we come up with an action plan to help refine skills, deepen understanding, and increase confidence. Looking across the program, the CQI checklists help us plan staff development for the team of Family Service Advocates.

How we assess Family Service Advocates is not much different from how we assess children and their families. Our goal is to identify their strengths and competencies and from there, address their needs.

A Family Service Advocate's Story

In Project Eagle Early Head Start, I work with 14 to 15 families every week. My first responsibility is to develop a partnership with parents and help them address their family's needs. In turn, I need ongoing support from the Coordinator to do my job well. What follows is a "family story" about how I work with a parent and her children, and how the Coordinator and I work together to meet the individualized needs of this family (whose names have been changed).

At the time of enrollment, May was 22 years old and a single parent. Her son, Tae, was one year old and she was expecting her second child in a few months. May was experiencing a high-risk pregnancy due to medical complications. She was living in a rent-assisted housing project and receiving cash assistance and medical coverage through the Medical Assistance Program. May's original goals were to have a full-term baby, find child-care for Tae (she was confined to bedrest during her pregnancy), and once the baby was born, work on her employment goals. May, a high school graduate, wants to go on with vocational training. She is very motivated to improve her family's conditions, and she has a close network of family members.

Our home visits occur weekly at May's apartment. We also have two agency-sponsored parent-child socialization playgroups each month. I have observed May with her children and the children with their fathers. May has good parenting skills. She is able to describe her children's development, understands their temperament, and sets appropriate limits. She maintains an open relationship with the children's fathers so that they can be involved in family life. The family is very interested in the motor development of the baby, Journie. They want to help her learn to move, roll over, and hold her head up. We talk about all areas of development, but motor development continues to be the one they consider most important. I think there is an underlying concern about Journie's motor development because of her mother's undetermined seizure disorder. In my role, I have to be aware of parental worries and how they can affect parent-child interactions.

When Journie was just a few months old, I invited the Coordinator, Stephanie, to attend the child-parent socialization group to help me identify more home visiting strategies. I asked her to help me observe Tae's and Journie's overall development and specifically to focus on the baby's motor development. Stephanie evaluated me by observing the first seven items on the CQI form. We sat together after the playgroup and reviewed the interactions I had with the family, and she gave me feedback on the items. Stephanie pointed out that I listened carefully to May while she discussed Journie's motor movements during the playgroup. She also noted that I talked to May about the next developmental stages she should expect to see now that Journie is rolling over. Stephanie made me aware that I had pointed out to May that she had read Journie's cues correctly when the baby wanted to be picked up. Based on her CQI observation, Stephanie praised me for all the positive feedback I gave May. One area that I needed to improve on was asking more open-ended questions to find out from May what kind of progress and accomplishments Journie had shown during the past week. I found this suggestion very helpful and a reminder of how to improve my future interactions with May.

Stephanie also recommended additional ways I could support the family during our weekly visits. She pointed out that children like to be in different positions. The parents might try having Journie bear some weight on her legs and supporting her in a sitting position. Also, she talked about dangling a toy and moving it to the side to really encourage Journie to explore and move in that direction. We talked about how these activities could be incorporated into the family's routines, maybe during bathing, mealtimes or diapering. Using the CQI observation form to guide our discussion is non-threatening and useful; the items are specific and Stephanie finds a number of behavioral examples to back up her comments.

In Project Eagle, we use the Parents as Teachers curriculum and supplemental resources to individualize lesson plans for each family. It is up to me to understand developmentally appropriate practice and adapt the curriculum if needed. The Coordinator has helped me look at child-initiated play and the routines of the family to work out a program that promotes a "goodness of fit" between the parents and the child. The parents, in this case, like activities where they interact in active, hands-on ways. Though May likes to do things, her baby is more passive. We have tried to show how important it is to let Journie struggle just a little to build new skills. Journie's parents read her cues well and do not let her struggle too long.

It is also imperative that the Coordinator looks at the Advocate/parent "goodness of fit" when developing an appropriate plan for the family. Stephanie has encouraged me to let May's family "struggle" with new skills and give them time to explore new ideas. She has helped me to look for the family's cues indicating they have reached their limit and, in response, reduce or change activities. During one visit, she showed me how to address a family's concern by listening, offering suggestions, and providing immediate feedback. She has also shown me that flexibility during a visit is sometimes more important than following a specific plan.

The Coordinator is invaluable for helping me rethink the situation and think "outside the box." This process of learning results in changes in my ways of interacting during home visits and sharing information with the family. As I work closely with the Coordinator, my skills are enhanced so that I am able to provide family-centered, culturally sensitive, and individualized lessons for the child and family. When assessment is not a scary experience, but an opportunity for me to grow and learn, I benefit-and so do the families.

ECE Home Visit-Continuous Quality Improvement Checklist

Date: ________________ Advocate: _________________________

  • ________ Advocate is able to listen carefully to parents'/primary caregivers' (PCG) needs in the area of child development and parent-child relationships.
  • ________ Advocate asks "open-ended" questions that are helpful in gaining more information about the parent-child relationship or about the child's development.
  • ________ Advocate uses "instructive feedback" (i.e., Advocate responds to something specific that she/he observes while the parent-child are interacting) which points out why the parent/PCG's behavior is important to their child's development.
  • ________ Advocate uses "positive feedback" (i.e., positive, specific & contingent on the parent's/PCG's appropriate nurturing behavior with their child during the home visit) with the parent/PCG.
  • ________ Advocate demonstrates knowledge of child development and is able to review and discuss developmental domains with parents/PCG's.
  • ________ Advocate is able to observe parent-child interactions, read infant/toddler and parent/s or PCG's cues.
  • ________ Advocate is able to guide, coach and model appropriate intervention strategies that support the parent/PCG's interactions with their child and may be integrated into their daily routines.
  • ________ Advocate is able to administer a screening and an assessment in a developmentally and culturally appropriate manner.
  • ________ Advocate is able to review the ECE Recommended Practices/Outcomes with the parent/s or PCG and write individualized family and child action plans with them.
  • ________ Advocate describes to the parent/s or PCG which PAT lesson plan is being used and why.
  • ________ Advocate clearly states the objectives of the activity/lesson.

  • ________ Advocate presents the individualized lesson plan in a clear, concise and friendly manner.
  • ________ Advocate asks the parent or PCG how the "follow-up" activity worked for them.

  • ________ Advocate explains and reviews the focus of the next PAT lesson with the parent/s or PCG.
  • ________ Advocate provides parent/PCG with options for a "follow-up" activity.
  • ________ Advocate encourages parent/s or PCG to assess their child's developmental progress and to complete the observation and comment sections of the lesson plan.

Further comments/observations:

 
 
 
 

Rating key:  - Needs Improvement      Adequate      + Highly Successful

COORDINATOR of ECE ________________

ADVOCATE_______________

Stephanie Hudson is the Early Childhood/ Disabilities Services Coordinator for Project Eagle. T: 913-281-2648; E: shudson@kumc.edu.

Mary Sinur is a Family Services Advocate with Project Eagle. T: 913-281-2648; E: MSINUR@kumc.edu.


Participating in a Research Project: A Head Start Program's Experience
An Interview with Gayle Cunningham
Guest Editor Judy David interviewed Gayle Cunningham for this article.

Gayle Cunningham is the Executive Director of the Jefferson County Committee for Economic Opportunity (JCCEO) located in Birmingham, Alabama, and Director of the agency's Head Start and Early Head Start programs. There are over 1300 children in 70 classrooms in both urban and rural sites. From 1995 to 2000, the Head Start program worked in partnership with the Georgia State University (GSU) Research Center on Head Start Quality. GSU is one of four Quality Research Centers funded to address the influences on quality and the impact of quality on children and families. They have used a number of different instruments to assess quality, including observations in classrooms, parent surveys, and staff interviews. The new Head Start mandates require that programs use information on child outcomes in local self-assessment. Outcomes information from this research project has been used for that purpose at JCCEO.

Gayle was interviewed for this Bulletin and asked to describe what it has been like to have Head Start participate in a research project. She describes how her program has used the research findings to improve classroom practice and Head Start services. For many programs, the notion of research is threatening to administrators, teachers, and parents. But in the case of Gayle's program, the collaboration has been very successful and helpful to Head Start. Gayle explains why.

Q: Why did you decide to join the research project on quality in the first place?
We'd had a successful research experience before. Over ten years ago, our Head Start program participated in the Transition Project, funded by the ACYF/HSB. At that time, my interest in research was very small; rather, I was interested in continuing the Head Start services from kindergarten to grade 3, which was part of the project. I thought evaluation was just a necessary evil, but I came to appreciate the value and methods that Dr. Martha Abbott-Shim, the principal investigator, and the other researchers used to work with the Head Start staff, children, and families. When that research project ended, there was a new research Request for Proposals out on Head Start quality. Martha and I each read it and immediately thought of another collaboration. She was awarded the grant at Georgia State University and we partnered with her. From the very beginning, I was clear about my goals - to use the research findings to enhance staff development and our own internal processes for improvement.

Q: What has the research involved for your local program?
At various times during the program year, evaluators came into the classrooms to collect data. They observed, using the Assessment Profile for Early Childhood Programs II, which is similar to NAEYC accreditation criteria. They also interviewed staff and parents. The research project paid for a Research Coordinator on site. She was responsible for hiring and training the testers, collecting the data, and coordinating between Georgia State and our program. There was also a part-time researcher who collected data from our parents.

Q: What are the key elements that have made this research partnership so successful?
The on-site Research Coordinator helped everything work smoothly. The research project was not perceived as an extra burden by Head Start staff because she took care of the research tasks. She made friends with the staff and became part of our Head Start family. Staff was willing to cooperate because they trusted her. My message to my staff and parents was that we were privileged to participate in the research undertaking because it also would help us improve our program. We have shared results with staff, so this has been an open, aboveboard process.

Q: What have been the significant research findings?
Our Quality Research Center has found that Head Start teachers with more individualized teaching practices tend to promote greater overall developmental gains, and higher cognitive and social functioning for children. Also, these teachers reduce the effects of maternal depression on children's social behavior. An important finding for us was that the educational level of staff relates to their beliefs about children, their instructional activities, and, in turn, the quality of the classroom. The research also found that the teacher training we have offered positively affected classroom practice.

Q: How did you share the findings with the Head Start staff?
We discussed the research results at our management meetings first, and then presented summary findings at training sessions with teachers and aides. We never discussed the data on the individual classroom or cluster level (a grouping of several classrooms). The management staff knew the breakdown, but we did not want competition among the staff. We didn't want to reward or punish teaching staff for what the research found in their classrooms. We also shared the findings with the Policy Council so that the governing group of parents and community representatives would be well informed. Martha also came to several full staff meetings to talk in general about the results. Our focus has been on communicating what staff, teachers, and parents need to know to make improvements in the delivery of services. Since research findings are still coming out, this is a continuous process for us.

Q: Have the research results affected your program operation and classroom practices?
Definitely. We have used the results to plan staff development. The research has helped identify the areas where we need to improve. For example, when the children's data indicated that they were weak in some early literacy skills, we strengthened a variety of experiences in the classrooms. Now we see many more instances where teachers are inviting children to write, teaching children rhymes, sending books home with children, and creating print-rich environments. Data also showed that parent literacy is closely linked to children's literacy, so we started a number of initiatives, including a reading program for fathers and their children, and GED preparation. When the findings indicated that the children were not doing well in math, we beefed up that area of the curriculum by adding new materials and training. We also found that the better-educated teachers did more individualizing in classrooms. So we've encouraged, supported, and rewarded staff taking more teacher training courses at local colleges. Overall, the research findings have raised our expectations for teachers and the kinds of curriculum experiences they plan for children.

Q: How do you think the Head Start Outcomes will affect assessment in your program?
In reality, all programs already collect data because they do screenings and assessment. But they don't necessarily use the information effectively. That's because there have not been clear expectations about what to do with the data. With the new regs and Head Start Outcomes, there are clear expectations. We have to review and revise our assessment instruments to ensure that we obtain the required information. Our program has just begun to look at the Outcomes. We'll see whether the domains and indicators map back to our instruments, and we'll tailor them accordingly. Martha helped us develop our instruments, and our teachers find them easy to use. They are flexible and we can adjust them. I believe they already measure most of the Outcomes.

Q: What advice would you give other Head Start programs that are considering whether to participate in a research project?
Make friends with the researchers and develop a comfort level with them. Find out exactly what you're getting into-how often they'll be visiting the classrooms, talking with parents, etc. Most important, make sure the research findings will be useful to your program. Research is a shared activity; it should not take away from your program, but add to it. As we approach the new Head Start Outcomes, program and evaluation folks need each other even more. Program staff should call on them for help with their instruments and data collection and analysis, using their expertise to improve the quality of services to children and families.

Gayle Cunningham is the Executive Director of the Jefferson County Committee for Economic Opportunity. T: 205-290-9251; E: gcjcceo@aol.com


Improving Quality in the Classroom: Observations and Recommendations from the Head Start Quality Research Centers
Adapted from a presentation at the National Head Start Association Conference in Washington, D.C. on April 28, 2000.

In 1995, the Head Start Bureau funded four Quality Research Centers (QRCs) to work in partnership with local Head Start programs to define, assess, and verify the effectiveness of high quality program practices in Head Start programs. The four centers are—

  • Georgia State University (Principal Investigator: Martha Abbott-Shim);
  • High/Scope Educational Research Foundation (Principal Investigator: Lawrence Schweinhart);
  • Frank Porter Graham Child Development Center, University of North Carolina (Principal Investigators: Donna Bryant, Ellen Peisner-Feinberg); and
  • Education Development Center for Children and Families, with partners at Harvard University, Boston College, and Massachusetts Society for the Prevention of Cruelty to Children (Principal Investigator: David Dickinson).

A total of 16 Head Start programs were involved in the QRC effort. Research studies included 1,306 children in 367 classrooms. This article summarizes the findings of all four QRC projects.

What Did We Learn About Program Quality?

QRC researchers assessed program quality by observing classroom environment, curriculum activities, teaching strategies, and staff interactions with children. Classroom quality was defined as including the following elements—

  • Learning environments that have a variety of materials that are accessible to children and support diverse learning experiences.
  • Daily schedules that balance active and quiet activities, small and large group experiences, and child-directed and teacher-directed experiences.
  • Developmentally appropriate curriculum that includes alternative teaching strategies and opportunities for children to guide their own learning.
  • Positive interactions initiated by teachers, responsiveness towards children, and consistent behavior management.
  • Individualizing efforts that use developmental assessments and purposeful efforts to support language and literacy development in children.

Key findings on classroom quality include-
  • Close to one-half of the classrooms score in the "good" range or above.
  • Measures of 'learning environment,' 'scheduling,' 'curriculum,' and 'individualizing' improved over three years while 'interacting' remained stable.
  • Head Start teachers are stronger in general early childhood education practices than they are in language, literacy, and curriculum practices.

How Is Program Quality Linked to Child Outcomes?

Quality classrooms matter because of the relationship between classroom quality and children's cognitive, language, and social skills development.

Cognitive and Language Development:

The QRCs assessed children by observation, direct assessment, and teacher ratings. Aspects of language development that were measured included: receptive vocabulary, communication skills, phonemic awareness, literacy skills, and letter-word identification. Cognitive measures included: general developmental status, math and reasoning skills, and school readiness.

Research findings related to cognitive and language development included—

  • Children show growth over the Head Start year in language, math, and cognitive abilities.
  • Child phonemic awareness and literacy skill scores increase over the Head Start year.
  • Spanish-speaking children increase more in receptive language over the Head Start year than English-speaking children, but remain below the levels of English-speaking children.
  • Children in better quality classrooms score higher at the end of the Head Start year than children in lower quality classrooms.
  • More individualized classroom practices are related to better cognitive development and fewer age-related differences.
  • Better language and literacy support in the classroom is related to greater literacy knowledge, phonemic awareness, and better math and reasoning skills.
  • Closer teacher-child relationships are related to greater phonemic awareness, higher developmental ratings, and better language skills in children.

Social Development

The QRCs measured five aspects of children's social development: positive behavior, prosocial behavior, problem behavior, positive initiative, and children's attitude/perception about Head Start and feelings about their competence. These aspects were measured via direct observations, teacher ratings, parent reports and child self-reports.

Research findings related to social development in quality classrooms include—

  • An increase in children's attempts to organize interaction with peers
  • A decrease in children's behaviors to accommodate others because they are better able to plan their interactions and have less conflict as a result
  • An increase in children's task completion rates
  • An improvement in children's social skills

Even after controlling for age, gender, and language, classroom quality still makes a difference for Head Start children as follows-

  • Fewer problem behaviors and more positive behaviors among children outside the Head Start classroom
  • More pro-social behavior among children
  • Less "purely social" play among children that is not goal-oriented

Developmentally appropriate curriculum is related to—

  • More positive initiative and fewer problem behaviors among children
  • More positive attitudes about Head Start and self

Close positive teacher-child relationships are related to—

  • Fewer problem behaviors
  • Positive attitudes and perceptions about Head Start and self

Conclusions

In summary, we have learned the following about classroom quality-

  • Head Start classroom quality makes a difference in children's growth and development and readiness for kindergarten.
  • Children in higher quality classrooms are doing better at the end of the Head Start year in cognitive, language, and social skills.
  • Different aspects of classroom quality, namely, teacher-child relationships, global classroom practices, and specific classroom practices are related to different areas of child development.
  • Quality improvement efforts need to consider both classroom practices and teacher-child interactions as well as provide training in implementing a variety of specific teaching practices.
  • There is a need for special emphasis on developing language and emergent literacy skills because observational evidence indicates that classroom practices related to developing language and literacy skills are weaker than other early childhood education classroom practices.

We have also learned that quality is affected by a number of attributes, including-

  • There is an indirect influence of teachers' formal education on classroom quality through teachers' beliefs.
  • Experience in teaching in Head Start programs is not in itself sufficient to guarantee high classroom quality.
  • Teachers who have more developmentally appropriate beliefs and practices tend to have higher quality classrooms and higher quality interactions with children.
  • Teachers of high quality classrooms tend to hold more positive views of children's parents.
  • Classrooms with higher quality adult-child and other interactions tend to have fewer children per class and fewer children per adult.
  • Several factors are related to teacher job satisfaction, which in turn influences program quality. These include the perception among teachers that policies are clear, that administrators are supportive, that communication with administrators is possible, and that well-qualified teachers/aides are hired.
  • Staff turnover also is related to job satisfaction. Features of the job that are related to lower turnover are feelings that the center is a collegial, innovative environment and that one's supervisor is supportive.


Getting Inside Outcomes
By Barbara B. Rosenquest

The Rhode Island Child Outcomes (RICO) Project is a new statewide initiative, created in 1999, to define and assess outcomes across children's programs. The initiative involves collaboration among teachers and administrators in Head Start, the Department of Education, the Department of Human Services, the Head Start Quality Improvement Center, and RI Kids Count. Efforts to date have moved the groups closer together to develop a shared vocabulary and a general agreement on important areas of development for Rhode Island's children.

The three key functions of the project are-

  • Assisting local programs in developing a common set of practical, relevant outcomes that can be used to determine the impact of classroom practices on Head Start children.
  • Drawing on the knowledge of teachers to identify indicators of learning and achievement as children exit Head Start and enter school prepared to learn.
  • Using data on child outcomes to guide efforts to improve teaching practices and to target plans for staff development

The RICO Project predates the development of the Head Start Child Outcomes Framework, but the efforts overlap with the Head Start Bureau's intent to develop a picture of child competency.

Developing Outcomes: Three Phases

From April through June 2000, the Project evolved in three phases:

  • Phase 1: Defining Developmental Domains and Child Outcomes
  • Phase 2: Gathering Data as Examples of Indicators
  • Phase 3: Pilot Study of RICO Outcomes

Phase 1: Defining Developmental Domains and Outcomes

The project began in April 2000. At the first meeting a group of Head Start teachers and Education Coordinators reviewed a preliminary list of assessment items extracted from the Head Start Performance Measures and from a survey of assessment instruments used in Rhode Island programs. After analyzing the assessment items, the group agreed on four general domains for outcomes: Literacy and Language Skills, Cognitive and Numeracy Development, Attitudes toward Learning, and Social and Emotional Well-being, along with a set of subcategories within each domain. The process of synthesizing developmental domains and subcategories from formal, validated assessment tools ensured that the RICO inventory was consistent with established knowledge and practice in the field. The preliminary list of RICO domains and subcategories was then compared to the child assessment measures used by local programs and found to be compatible.

The participants were asked to translate the RICO domains and subcategories into a form that would be useful in a Head Start program. They defined each general domain and subcategory in terms of specific outcomes and indicators evident in a Head Start classroom. For example, in discussing the domain of Attitudes toward Learning, the indicator 'Initiative' was described as, "The child plans for and makes choices about learning." The indicator, 'Investigates', under Cognitive and Numeracy Development, was further defined as, "The child explores, investigates, asks questions and makes predictions about the surrounding environment." The collaborative process involved in crafting the RICO definitions required participants to articulate the competencies they considered essential in the development of young children and their school readiness.

Phase 2: Gathering Data as Examples of Indicators

The group left the first meeting with the task of working with teachers in their programs to collect classroom observations illustrating how a child might demonstrate competence according to the RICO indicators. Teachers from seven rural and urban Head Start agencies gathered examples of child activity, along with information about the elements of teaching that contributed to the child's behavior. Each teacher noted how the classroom environment, the activities taking place, or specific interactions with the child may have affected the child's behavior. This phase tied the process of developing outcomes to teachers' observations and reflections.

Once observations were collected across programs, the teachers and Education Coordinators analyzed the data at a second statewide meeting. They matched behavior episodes to indicators. For example, behavior that related to the Cognitive and Numeracy Development subcategory 'Investigates' included: types of questions children ask, how children explore measurement at the sandbox, and categorizing collections of materials from a nature walk. The subcategory 'Initiative' was exemplified by: how children negotiate the choice board, assume roles in the housekeeping area, and direct stories at a puppet show.

Because the observation data were collected by teachers, the examples were immediately familiar, useful, and specific to the experience of Head Start classrooms. When similar observations related to one indicator were collected from different programs, the overall understanding of the indicator was reinforced. In other instances, the contribution of unique examples of child activity led to a deeper, more broadly defined indicator.

Phase 3: Pilot Study of RICO Outcomes

At the third and final statewide meeting, project participants met to devise ways to pilot the RICO. They suggested integrating the RICO instrument into the local assessment tools as a first step. As a result, programs will be able to identify specific RICO domains and indicators that are exceptionally challenging for teachers to observe and conceptualize. Then statewide or local program training can address these needs.

It was also decided that pilot studies will map the RICO domains and indicators onto the eight Domains and related Domain Elements of the Head Start Child Outcomes Framework. Once data on children are collected using the RICO, it will be possible to determine other training needs related to program improvement.
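To picture how such a crosswalk might be organized electronically, the short Python sketch below pairs each RICO domain with Framework Domains. The domain names come from this Bulletin, but the specific pairings are illustrative assumptions only, not the Project's actual mapping.

    # Hypothetical crosswalk from the four RICO domains to Head Start Child
    # Outcomes Framework Domains. The pairings are illustrative, not official.
    RICO_TO_FRAMEWORK = {
        "Literacy and Language Skills": ["Language Development", "Literacy"],
        "Cognitive and Numeracy Development": ["Mathematics", "Science"],
        "Attitudes toward Learning": ["Approaches to Learning"],
        "Social and Emotional Well-being": ["Social & Emotional Development"],
    }

    def framework_domains_for(rico_domain):
        """Return the Framework Domains a RICO domain maps onto (empty list if unmapped)."""
        return RICO_TO_FRAMEWORK.get(rico_domain, [])

    for rico_domain, targets in RICO_TO_FRAMEWORK.items():
        print(rico_domain, "->", ", ".join(targets))

A table like this makes it easy to see, for each RICO domain, which Framework Domains still need indicators before the pilot data can be rolled up.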

Benefits of the RICO Approach to Defining Outcomes

The impact of the RICO Project was immediate. After the first meeting, participants were surprised to realize that everyone approaches assessment from different perspectives. As a result, several made changes in their program's child assessment instrument to address gaps identified from a comparison of their system with the expanded list of RICO outcomes and indicators. Other participants realized that some assessment items on their program's instruments were too broad to assess effectively. They drew upon the list of RICO indicators to further define and narrow those items.

The process of developing outcomes also became an informal needs assessment related to teacher training. Several participants acknowledged that requiring the teachers to reflect upon their teaching had helped staff realize that many small decisions are made each day to support children's learning. Some teachers were able to gather rich observations and to make the connection between their teaching strategies, the child activity they observed, and the RICO indicator. Other teachers recognized that they needed assistance in collecting observations or in articulating the impact of their decisions on child development and learning.

Involvement of Head Start teachers, administrators, state personnel, and researchers and consultants in the RICO Project created a multilayered applied research project. At each level, participants worked alongside each other to clarify their understanding of developmental theory, articulate their priorities for how children learn and develop, and analyze the teaching required to realize outcomes. This was a positive learning experience for all.

Including data from observations of Head Start children considered "typically developing" ensured that the process of determining outcomes was grounded in and enhanced by teachers' knowledge. Teachers became part of the process of identifying significant child outcomes rather than acting simply as agents of assessment.

Conclusion

In sum, the RICO Project represented a collaborative team approach to understanding outcomes that extended from the state level into the classroom. The process is expected to lead to improvements in program outcomes and child progress. At the conclusion of the RICO pilot studies, Head Start programs will decide whether to use the RICO as the assessment tool with the Head Start Child Outcomes.

Barbara B. Rosenquest is an Assistant Professor of Education at Wheelock College. T: 617-879-2158; E: brosenquest@wheelock.edu.


Five Steps to Assessing Child Outcomes
By Mary Lou Rush, Dawn Denno, Edith Greer, and Ann Gradisher

Introduction

Three years ago the Ohio legislature mandated that all state-funded Head Start programs measure the progress of the children they serve using a common assessment instrument. The responsibility for designing the measurement system fell to the Office of Early Childhood in the Ohio Department of Education.

In Ohio, we are serving over 80 percent of the children who qualify for Head Start services. We are able to serve this many children due to gubernatorial and legislative support to expand state Head Start funding from less than $14 million per year in 1990 to over $100 million this year. Along with this funding came an underlying concern about accountability. Legislators began asking: How well are the children doing in the programs we fund? Did Head Start give children a head start? Did the children enter kindergarten better off than they would have been if they had not been enrolled? For a long time, we avoided addressing the "results" questions. We should have listened more carefully and reacted more quickly than we did. In 1996, one of the research arms of the Ohio State Legislature conducted a study that found little positive evidence of the impact of Head Start on children's literacy and social competency skills. We had very little information to refute the findings.

The handwriting was on the wall. We had to put a system in place that would provide the data to demonstrate the impact of Head Start on children. At the same time, we determined that we were also going to include our public preschool programs and preschool special education services in our outcomes system. This meant that, eventually, we would be collecting data for approximately 80,000 children in over five hundred local programs.

The system we developed is based on the Measurement and Planning System (MAPS) child assessment section of the Galileo software application. Teachers collect observational data on children's work in the areas of language and literacy, early math, social development, self help, and nature and science and enter the information into a computer. Data are collected at the beginning of the year to document skills the children have at entry. Teachers update the information as the year progresses and then enter data at the end of the year. This system gives teachers, parents, administrators, and legislators a comprehensive picture of the progress children are making over time in their early childhood program.
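The Python sketch below illustrates the kind of record the paragraph describes: an entry rating and an end-of-year rating per child per content area, with a simple fall-to-spring gain computed for each area. The field names and the rating scale are invented for illustration; they are not the Galileo/MAPS data format.

    from statistics import mean

    # Illustrative records only: (child_id, content_area, fall_rating, spring_rating).
    observations = [
        ("C01", "language and literacy", 2, 4),
        ("C01", "early math", 1, 3),
        ("C02", "language and literacy", 3, 4),
        ("C02", "early math", 2, 2),
    ]

    def gains_by_area(records):
        """Average fall-to-spring gain for each content area."""
        by_area = {}
        for _, area, fall, spring in records:
            by_area.setdefault(area, []).append(spring - fall)
        return {area: mean(gains) for area, gains in by_area.items()}

    for area, gain in gains_by_area(observations).items():
        print(area, "average gain", round(gain, 1))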

The purpose of this article is to describe the five key steps we took in Ohio to set up this assessment system. We believe our experience can help local Head Start grantees as they plan for and implement requirements from the Head Start Bureau to gather, analyze, and use information on child outcomes in new ways.

Step 1 - Getting the Help You Need

It would be a big mistake to enter into the task of starting an outcomes measurement system thinking that you have all of the answers. Asking for help maximizes the potential for positive consequences and minimizes the potential for negative consequences. So, deciding who to ask, how to ask, and what to ask is an important part of this process.

Some of the best help we got was by reading books and articles on assessment. A key message from our reading was to be clear on the purposes for assessment and to be sure the system serves those purposes. Thus, before we did anything else, we decided our two central purposes were to report on the overall levels of progress of children in Head Start in Ohio, and to provide assessment information that would be useful to teachers. That is, we wanted our new statewide system to reinforce what good teachers were already doing on a daily basis-observing and assessing children's progress to help make instructional decisions. Finding out what a child knows and is able to do helps teachers plan new experiences to advance learning. We wanted the assessment to fit into these daily routines, to provide a common approach to documenting progress of children, and to assist teachers in promoting progress.

A second major source of help in our planning was to draw on several stakeholder groups to help design our system. We convened a series of discussion sessions with groups including legislators, advocates, program directors, staff, parents, and state department staff-particularly those with expertise in assessment and information technology. Each group was asked the same question, "What outcomes do you expect from a quality early childhood program?"

Next, we held a synthesis meeting with representatives from each group of stakeholders to get a consensus on the final child outcomes and to begin developing a continuous improvement system for measuring and using outcome data. The synthesis meeting was scheduled for two days, but into the second day, we were still not agreeing on much. We were sensing resistance or reluctance from some Head Start staff members. Finally, one Head Start Director, Mary Hodge from Toledo, stood up, faced the group, and asked, "Why is this so difficult? We are talking about our bragging rights! Our early childhood programs work and we are deciding which of these outcomes we want to brag about! These are indicators of our successes!" Thus, the project was and will forever be called the Indicators of Success Project (although our personal preference was to call it the Early Childhood Bragging Rights Project!)

In addition to reviewing literature on assessment and conducting our public engagement strategy, we also sought as much technical assistance as we could find. For instance, we attended a meeting hosted by the National Early Childhood Technical Assistance System with other states that were wrestling with early childhood program outcome measures. This meeting helped us in conceptualizing an outcomes-based continuous improvement system for programs in our state.

Our advice is to get help from the beginning of your planning and decision-making process. We learned valuable things from reading, widespread involvement of stakeholder groups, and technical assistance services. The people and resources you identify may be different from those we used. Starting with the purposes of your child outcomes effort, we urge you to seek out help from experts and involve the people who will implement and use your child assessment system.

Step 2 - Selecting an Assessment Instrument

Once we had agreement on the purposes of our assessment initiative and the content areas of child outcomes, we began to work on selecting an assessment instrument for programs to use on a statewide basis. We worked with our stakeholder groups and experts to develop a set of criteria for choosing an assessment instrument. The full set of criteria looked like this-

Purpose of Assessment

  • Provide information to stakeholders about expectations
  • Be useful to teachers for planning instruction
  • Be useful to administrators for improving programs
  • Identify children who may require special interventions
  • Track child progress toward fourth grade curricula outcomes
  • Provide information for program accountability

Early Childhood Values

  • Collect data by observing children in a natural setting
  • Be usable with children from birth to age eight, at all ability levels
  • Categorize observations in content areas and developmental domains

Technical Requirements

  • Provide data on individual children that can be aggregated at the classroom, program, and state levels
  • Provide descriptive statistics and gain scores
  • Available in computerized and paper formats
  • Compatible with the State of Ohio Education Management Information System

After an extensive review, we found the MAPS section of the Galileo software application most appropriate, given our criteria.
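As a rough illustration of the technical criteria above, the Python sketch below rolls hypothetical child gain scores up to the classroom, program, and state levels and reports simple descriptive statistics. The record layout and the numbers are invented; they do not reflect actual MAPS reports.

    from statistics import mean

    # Invented example records: (program, classroom, child_id, gain_score).
    gain_scores = [
        ("Program A", "Room 1", "C01", 2),
        ("Program A", "Room 1", "C02", 1),
        ("Program A", "Room 2", "C03", 3),
        ("Program B", "Room 1", "C04", 0),
    ]

    def summarize(records, key):
        """Count of children and mean gain, grouped by the given key function."""
        groups = {}
        for record in records:
            groups.setdefault(key(record), []).append(record[3])
        return {group: (len(scores), mean(scores)) for group, scores in groups.items()}

    print("By classroom:", summarize(gain_scores, key=lambda r: (r[0], r[1])))
    print("By program:  ", summarize(gain_scores, key=lambda r: r[0]))
    print("Statewide:   ", summarize(gain_scores, key=lambda r: "all programs"))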

Many local Head Start agencies are currently reviewing their assessment instrument and evaluating other options, based on the new Head Start Child Outcomes Framework. As you look around to decide how to measure outcomes, we recommend developing a set of criteria, based on input from staff and other knowledgeable people, to guide your decisionmaking.

Step 3 - Aligning Curriculum and Assessment for Continuous Improvement

Once we had selected the MAPS assessment system, we turned our attention to connecting the assessment effort with curriculum in local Head Start and preschool programs. We began by "Ohio-izing" the MAPS assessment scales so that they directly measure the Ohio Department of Education's goals and expectations for preschool curricula. Then staff members from the Office of Early Childhood Education traveled around the state working with programs to align their curricula with the expectations.

Local programs worked to compare their curricula with the MAPS assessment framework. They reviewed their formal packaged curricula, and, in a number of programs, also convened work groups to analyze teachers' actual lesson plans to determine whether what goes on in the classrooms reflects the comprehensive scope of the outcomes they are hoping to achieve. Essentially, they are asking, "Are we providing the learning experiences to help our children reach the outcomes set forth for Head Start programs in Ohio?"

Two issues surfaced. First, some curricula did not adequately address all of the areas of child outcomes included in the MAPS assessment framework. Mathematics learning was the area most commonly found to be inadequately addressed in local curricula. The second issue was the reverse - some curricula addressed outcomes that were not measured by the Ohio outcomes system. For example, one of our programs, Miami Valley Child Development Centers, has made a significant investment to implement the High Scope curriculum. Education staff carefully compared the High Scope curriculum with the outcomes measured in the MAPS assessment system. They found that the outcomes in science measured by the MAPS tool were more comprehensive than those in the High Scope curriculum. However, in the art, music and movement content areas, they found that the outcomes in the High Scope system were more comprehensive. (Art, music, and movement are not content areas for which the State of Ohio requires outcome measures.) Miami Valley decided to collect data required by the state of Ohio and information about the areas of art, music, and movement that they believe are important goals for children in their community.
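The two-way comparison these programs carried out can be pictured with simple set operations, as in the Python sketch below. The content-area lists are simplified stand-ins, not the actual MAPS or High Scope content.

    # Simplified stand-in lists of content areas, for illustration only.
    curriculum_areas = {"language and literacy", "social development",
                        "art", "music", "movement"}
    assessment_areas = {"language and literacy", "early math",
                        "social development", "nature and science"}

    # Outcome areas the statewide assessment measures but the curriculum misses.
    print("Assessed but not taught:", sorted(assessment_areas - curriculum_areas))

    # Curriculum areas the statewide assessment does not measure.
    print("Taught but not assessed:", sorted(curriculum_areas - assessment_areas))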

Spending time to analyze information on child outcomes is worthwhile if it helps programs answer key questions like, "How are our children doing?" "Is what we are doing working?" and "How can we be even more effective in preparing children for school?" Making sure your curriculum and assessment systems are lined up with a common set of goals for children is crucial to answering these questions in useful ways. Then you can use information on children's progress to plan for continuous program improvement. It is also important to be sure that your curricula reflect the values and priorities of your staff, program managers, families, and community.

Step 4- Testing Your System

We knew that we had a lot to learn about how this assessment system would work for all of our programs. We decided to field test the system to be sure we were getting what we needed and that the system was not overly burdensome on our programs. Fifteen Head Start programs, three public school preschool programs, and three preschool special education programs volunteered to participate in the field test.

The field test allowed us to try out approaches with a manageable number of motivated programs. They were our practice group for funding, technical assistance, training, implementation, and reporting. We convened another broad-based advisory committee of forty stakeholders to meet quarterly to receive updates and advise us on the policies and procedures. To evaluate our field test, we conducted focus groups, interviews, and surveys of teachers, education administrators, directors and parents. Results indicated that the teachers and administrators were satisfied with the system. They believed the system was useable, the items on the scales were meaningful and the system provided useful reports. Parents of children also said that the reports were useful and understandable.

We found some areas that needed improvement-

  • Teachers and assistant teachers reported that they needed more training.
  • Teachers found that it was difficult to document progress using this system for children with very involved disabilities. For example, teachers of children diagnosed with autistic spectrum disorder complained that their children would not be likely to show any improvement on the language scale assessment items in an entire year.
  • Teachers reported that they were concerned about the amount of time they spent on assessment. Teachers reported spending an average of 2.5 hours per week on assessment, although the amount of time decreased as they became familiar with the system.

To address the areas of need, revisions have been made in the frequency and type of training offered. In addition, a committee to support children with complex disabilities came together to design a system which will provide teachers with more finely differentiated information about the progress of their children.

We believe it is valuable for local agencies to test changes in ongoing child assessment and procedures for analyzing information on children's progress and accomplishments. Implementing new efforts in a limited set of classrooms and centers can uncover problems and help fine-tune your system, and contribute to more successful implementation.

Step 5 - Ongoing Implementation and Problem Solving

This year, approximately 30,000 children are being assessed in the Indicators of Success Project. We have learned a lot over the last three years. We are continuing to work hard to build a system that will work for teachers and children during this period of rapid expansion and implementation. Our priorities focus on training, equipment, and continuing to communicate. We need to be sure that staff have the computers and training they need to use the system. To keep communication open, we set up committees on curriculum, the MAPS assessment scale, supporting children with complex disabilities, training, and technology.

One simple lesson we learned from implementing this project is that data on child outcomes at the beginning of each year are the most important and useful information to guide program improvement efforts. These data can help programs make better decisions in allocating resources, staff development, and technical assistance to improve the progress of children.

Another important lesson we learned is that when a system begins to hold programs accountable, teachers feel pressure to reach the specified outcomes. One program director reported that she observed her teachers walking around the room with clipboards documenting what was going on rather than facilitating learning. Another director told us that teachers were using "drill and kill" teaching strategies because they felt pressure to prepare children for state assessment efforts. We have communicated with teachers around the state, helping them understand that good instruction will be the deciding factor in improving child outcomes - not good paperwork. To clarify our support for developmentally appropriate practices, we created a User's Manual with many, many examples of how to observe and foster progress on outcomes in a developmentally appropriate way.

Conclusion

The implementation of the Ohio child outcomes assessment system has depended on relationships. Thousands of individuals have walked these five steps with us. We have continued to develop close partnerships that hold us responsible for the goals we all have for children. We have gained consensus on what outcomes to assess and how to measure them. And we have begun to be better able to communicate the very real impact our programs have on the lives of children and families.

When will our indicators begin to indicate success? Our hard work to document progress is already paying off. One large urban district has been able to document significant progress on some meaningful indicators. In the fall, they reported that 24 percent of their Head Start children could name ten or fewer letters. In the spring, 62 percent were documented as having done so. In the fall, 8 percent of their children could name eleven or more letters; in the spring, 40 percent could. In the fall, one percent of the children could write using some complete words; by the spring, 15 percent could do so.

Probably the comment that makes us the most proud came from a parent working on a committee to adapt our system to fit children with complex disabilities. She said that she believed that what we are doing in this area is very important because the system is strengths-focused. Her son has a significant brain injury and during the development of his Individual Education Plan, the school psychologist had written "Not applicable" in the section used to list a child's strengths. She said, "You are giving them something to write in that section."

We are far from finished with the design of this system. In fact, we do not intend to reach a point of completion. As we use data on children's progress to guide further improvements in programs and classrooms, we will continue to strive towards higher and more meaningful goals.

Mary Lou Rush is the Interim Director at the Ohio Department of Education, Office of Early Childhood. T: 614-466-0224; E: marylou.rush@ode.state.oh.us.

Dawn Denno is a consultant for the Ohio Department of Education, Office of Early Childhood. T: 513-874-1771; E: ece_denno@ode.state.oh.us.

Edith Greer is an Assistant Director at the Ohio Department of Education, Office of Early Childhood. T: 330-364-5567; E: ece_greer@ode.state.oh.us.

Ann Gradisher is an Assistant Director at the Ohio Department of Education, Office of Early Childhood. T: 330-220-6410; E: ece_gradisher@ode.state.oh.us.


Head Start Child Outcomes Framework

DOMAIN: 1. LANGUAGE DEVELOPMENT

DOMAIN ELEMENTS: Listening and Understanding

INDICATORS:

  • Demonstrates increasing ability to attend to and understand conversations, stories, songs, and poems.
  • Shows progress in understanding and following simple and multiple-step directions.
  • Understands an increasingly complex and varied vocabulary.
  • For non-English-speaking children, progresses in listening to and understanding English.

DOMAIN ELEMENTS: Speaking and Communicating

INDICATORS:

  • Develops increasing abilities to understand and use language to communicate information, experiences, ideas, feelings, opinions, needs, questions and for other varied purposes.
  • Progresses in abilities to initiate and respond appropriately in conversation and discussions with peers and adults.
  • Uses an increasingly complex and varied spoken vocabulary.
  • Progresses in clarity of pronunciation and towards speaking in sentences of increasing length and grammatical complexity.
  • For non-English-speaking children, progresses in speaking English.

DOMAIN: 2. LITERACY

DOMAIN ELEMENTS: Phonological Awareness

INDICATORS:

  • Shows increasing ability to discriminate and identify sounds in spoken language.
  • Shows growing awareness of beginning and ending sounds of words.
  • Progresses in recognizing matching sounds and rhymes in familiar words, games, songs, stories and poems.
  • Shows growing ability to hear and discriminate separate syllables in words.
  • Associates sounds with written words, such as awareness that different words begin with the same sound.

DOMAIN ELEMENTS: Book Knowledge and Appreciation

INDICATORS:

  • Shows growing interest and involvement in listening to and discussing a variety of fiction and non-fiction books and poetry.
  • Shows growing interest in reading-related activities, such as asking to have a favorite book read; choosing to look at books; drawing pictures based on stories; asking to take books home; going to the library; and engaging in pretend-reading with other children.
  • Demonstrates progress in abilities to retell and dictate stories from books and experiences; to act out stories in dramatic play; and to predict what will happen next in a story.
  • Progresses in learning how to handle and care for books; knowing to view one page at a time in sequence from front to back; and understanding that a book has a title, author and illustrator.

DOMAIN ELEMENTS: Print Awareness and Concepts

INDICATORS:

  • Shows increasing awareness of print in classroom, home and community settings.
  • Develops growing understanding of the different functions of forms of print such as signs, letters, newspapers, lists, messages, and menus.
  • Demonstrates increasing awareness of concepts of print, such as that reading in English moves from top to bottom and from left to right, that speech can be written down, and that print conveys a message.
  • Shows progress in recognizing the association between spoken and written words by following print as it is read aloud.
  • Recognizes a word as a unit of print, or awareness that letters are grouped to form words, and that words are separated by spaces.

DOMAIN ELEMENTS: Early Writing

INDICATORS:

  • Develops understanding that writing is a way of communicating for a variety of purposes.
  • Begins to represent stories and experiences through pictures, dictation, and in play.
  • Experiments with a growing variety of writing tools and materials, such as pencils, crayons, and computers.
  • Progresses from using scribbles, shapes, or pictures to represent ideas, to using letter-like symbols, to copying or writing familiar words such as their own name.

DOMAIN ELEMENTS: Alphabet Knowledge

INDICATORS:

  • Shows progress in associating the names of letters with their shapes and sounds.
  • Increases in ability to notice the beginning letters in familiar words.
  • Identifies at least 10 letters of the alphabet, especially those in their own name.
  • Knows that letters of the alphabet are a special category of visual graphics that can be individually named.

DOMAIN: 3. MATHEMATICS

DOMAIN ELEMENTS: Numbers and Operations

INDICATORS:

  • Demonstrates increasing interest in and awareness of numbers and counting as a means for solving problems and determining quantity.
  • Begins to associate number concepts, vocabulary, quantities and written numerals in meaningful ways.
  • Develops increasing ability to count in sequence to 10 and beyond.
  • Begins to make use of one-to-one correspondence in counting objects and matching groups of objects.
  • Begins to use language to compare numbers of objects with terms such as more, less, greater than, fewer, equal to.
  • Develops increased abilities to combine, separate and name "how many" concrete objects.

DOMAIN ELEMENTS: Geometry and Spatial Sense

INDICATORS:

  • Begins to recognize, describe, compare and name common shapes, their parts and attributes.
  • Progresses in ability to put together and take apart shapes.
  • Begins to be able to determine whether or not two shapes are the same size and shape.
  • Shows growth in matching, sorting, putting in a series and regrouping objects according to one or two attributes such as color, shape or size.
  • Builds an increasing understanding of directionality, order and positions of objects, and words such as up, down, over, under, top, bottom, inside, outside, in front and behind.

DOMAIN ELEMENTS: Patterns and Measurement

INDICATORS:

  • Enhances abilities to recognize, duplicate and extend simple patterns using a variety of materials.
  • Shows increasing abilities to match, sort, put in a series, and regroup objects according to one or two attributes such as shape or size.
  • Begins to make comparisons between several objects based on a single attribute.
  • Shows progress in using standard and non-standard measures for length and area of objects.

DOMAIN: 4. SCIENCE

DOMAIN ELEMENTS: Scientific Skills and Methods

INDICATORS:

  • Begins to use senses and a variety of tools and simple measuring devices to gather information, investigate materials and observe processes and relationships.
  • Develops increased ability to observe and discuss common properties, differences and comparisons among objects and materials.
  • Begins to participate in simple investigations to test observations, discuss and draw conclusions and form generalizations.
  • Develops growing abilities to collect, describe and record information through a variety of means, including discussion, drawings, maps and charts.
  • Begins to describe and discuss predictions, explanations and generalizations based on past experiences.

DOMAIN ELEMENTS: Scientific Knowledge

INDICATORS:

  • Expands knowledge of and abilities to observe, describe and discuss the natural world, materials, living things and natural processes.
  • Expands knowledge of and respect for their body and the environment.
  • Develops growing awareness of ideas and language related to attributes of time and temperature.
  • Shows increased awareness and beginning understanding of changes in materials and cause-effect relationships.

DOMAIN: 5. CREATIVE ARTS

DOMAIN ELEMENTS: Music

INDICATORS:

  • Participates with increasing interest and enjoyment in a variety of music activities, including listening, singing, finger plays, games, and performances.
  • Experiments with a variety of musical instruments.

DOMAIN ELEMENTS: Art

INDICATORS:

  • Gains ability in using different art media and materials in a variety of ways for creative expression and representation.
  • Progresses in abilities to create drawings, paintings, models, and other art creations that are more detailed, creative or realistic.
  • Develops growing abilities to plan, work independently, and demonstrate care and persistence in a variety of art projects.
  • Begins to understand and share opinions about artistic products and experiences.

DOMAIN ELEMENTS: Movement

INDICATORS:

  • Expresses through movement and dancing what is felt and heard in various musical tempos and styles.
  • Shows growth in moving in time to different patterns of beat and rhythm in music.

DOMAIN ELEMENTS: Dramatic Play

INDICATORS:

  • Participates in a variety of dramatic play activities that become more extended and complex.
  • Shows growing creativity and imagination in using materials and in assuming different roles in dramatic play situations.

DOMAIN: 6. SOCIAL & EMOTIONAL DEVELOPMENT

DOMAIN ELEMENTS: Self-Concept

INDICATORS:

  • Begins to develop and express awareness of self in terms of specific abilities, characteristics and preferences.
  • Develops growing capacity for independence in a range of activities, routines, and tasks.
  • Demonstrates growing confidence in a range of abilities and expresses pride in accomplishments.

DOMAIN ELEMENTS: Self-Control

INDICATORS:

  • Shows progress in expressing feelings, needs and opinions in difficult situations and conflicts without harming themselves, others, or property.
  • Develops growing understanding of how their actions affect others and begins to accept the consequences of their actions.
  • Demonstrates increasing capacity to follow rules and routines and use materials purposefully, safely, and respectfully.

DOMAIN ELEMENTS: Cooperation

INDICATORS:

  • Increases abilities to sustain interactions with peers by helping, sharing and discussion.
  • Shows increasing abilities to use compromise and discussion in working, playing and resolving conflicts with peers.
  • Develops increasing abilities to give and take in interactions, to take turns in games or using materials, and to interact without being overly submissive or directive.

DOMAIN ELEMENTS: Social Relationships

INDICATORS:

  • Demonstrates increasing comfort in talking with and accepting guidance and directions from a range of familiar adults.
  • Shows progress in developing friendships with peers.
  • Progresses in responding sympathetically to peers who are in need, upset, hurt, or angry; and in expressing empathy or caring for others.

DOMAIN ELEMENTS: Knowledge of Families and Communities

INDICATORS:

  • Develops ability to identify personal characteristics including gender and family composition.
  • Progresses in understanding similarities and respecting differences among people, such as genders, race, special needs, culture, language, and family structures.
  • Develops growing awareness of jobs and what is required to perform them.
  • Begins to express and understand concepts and language of geography in the contexts of their classroom, home and community.

DOMAIN: 7. APPROACHES TO LEARNING

DOMAIN ELEMENTS: Initiative and Curiosity

INDICATORS:

  • Chooses to participate in an increasing variety of tasks and activities.
  • Develops increased ability to make independent choices.
  • Approaches tasks and activities with increased flexibility, imagination and inventiveness.
  • Grows in eagerness to learn about and discuss a growing range of topics, ideas and tasks.

DOMAIN ELEMENTS: Engagement and Persistence

INDICATORS:

  • Grows in abilities to persist in and complete a variety of tasks, activities, projects and experiences.
  • Demonstrates increasing ability to set goals and develop and follow through on plans.
  • Shows growing capacity to maintain concentration over time on a task, question, set of directions or interactions, despite distractions and interruptions.

DOMAIN ELEMENTS: Reasoning and Problem-Solving

INDICATORS:

  • Develops increasing ability to find more than one solution to a question, task or problem.
  • Grows in recognizing and solving problems through active exploration, including trial and error, and interactions and discussions with peers and adults.
  • Develops increasing abilities to classify, compare and contrast objects, events and experiences.

DOMAIN: 8. PHYSICAL HEALTH AND DEVELOPMENT

DOMAIN ELEMENTS: Fine Motor Skills

INDICATORS:

  • Develops growing strength, dexterity and control needed to use tools such as scissors, paper punch, stapler, and hammer.
  • Grows in hand-eye coordination in building with blocks, putting together puzzles, reproducing shapes and patterns, stringing beads and using scissors.
  • Progresses in abilities to use writing, drawing and art tools including pencils, markers, chalk, paint brushes, and various types of technology.

DOMAIN ELEMENTS: Gross Motor Skills

INDICATORS:

  • Shows increasing levels of proficiency, control and balance in walking, climbing, running, jumping, hopping, skipping, marching and galloping.
  • Demonstrates increasing abilities to coordinate movements in throwing, catching, kicking, bouncing balls, and using the slide and swing.

DOMAIN ELEMENTS: Health Status and Practices

INDICATORS:

  • Progresses in physical growth, strength, stamina, and flexibility.
  • Participates actively in games, outdoor play and other forms of exercise that enhance physical fitness.
  • Shows growing independence in hygiene, nutrition and personal care when eating, dressing, washing hands, brushing teeth and toileting.
  • Builds awareness and ability to follow basic health and safety rules such as fire safety, traffic and pedestrian safety, and responding appropriately to potentially harmful objects, substances and activities.

For more information on how to use the Outcomes Framework, see IM-00-18 on "Using Child Outcomes in Program Self-Assessment," August 10, 2000.

The National Head Start Child Development Institute: Ensuring Quality and Accountability Through Leadership
By E. Dollie Wolverton

More than 3,200 people gathered in Washington, D.C., in December to participate in the National Head Start Child Development Institute, which was sponsored by the Head Start Bureau. The Institute brought together education leaders from Head Start and Early Head Start programs around the country to increase knowledge and leadership skills, and to support participants in developing vision and action plans to improve local program quality, management systems, and child outcomes.

Participants and planners alike are calling the Institute a tremendous success. One participant said, "This has been one of the best learning experiences on child development. Thank you for bringing us the best of the best in the field...and making them available to us later in the evening for more discussion. I couldn't get enough!" Another stated, "As a Head Start staff member for 15+ years, this is the best training that I have ever experienced."

Program Highlights

Institute participants received advance reading materials and assignments, including a guided review of their own local program services and quality in the five priority themes of the Institute: curriculum and assessment; social and emotional development; mathematics and science; language development; and literacy.

During the Institute, participants heard nationally recognized experts address each of these themes. (See the chart on pages 52-53 for an overview of the Institute program and faculty.) Participants then had the opportunity to meet for small group discussion. Affinity groups offered facilitated discussion among education leaders from different communities on interpreting and implementing the ideas, research, and effective practices offered in plenary presentations. Leadership Team Planning sessions gave local teams the opportunity to develop vision and action plans for program improvement using the Implementation Planner. Dialogues with Experts sessions enabled participants to meet with Institute faculty for more in-depth discussion of the issues raised in their plenary presentations. Western Kentucky University is awarding three units of graduate or undergraduate credit for work successfully completed.

The Implementation Planner

The Implementation Planner was developed as a guide for participants to use before, during, and after the Institute. Its design reflects the Institute content and structure, and includes the following elements—

  • The Head Start Child Outcomes addressed in the plenary sessions each day
  • Some of the applicable Head Start Program Performance Standards for each session
  • Examples of Head Start systems as they affect the education leader's role
  • Space to record information on the participant's own program
  • Sections for noting important points from pre-Institute readings, plenary presentations, and discussion sessions
  • Space for developing ideas for improving child development and education services in the participant's own program
  • Space to reflect on the question, "What do I need to do as an education leader to effect positive change?"
  • The Head Start Child Outcomes Framework (See Framework on pages 45-50 of this Bulletin.)

The Implementation Planner is particularly useful because it is linked to the Head Start Program Performance Standards, as well as to management and leadership roles. It focuses attention on ways to improve current management systems in order to support more effective program services and positive child outcomes. It also poses questions to help leaders consider ways to improve the child development and educational aspects of their programs.

Next Steps

The Institute was a powerful learning opportunity for the thousands of Head Start managers who participated, and it was a first step in the larger initiative to enhance program quality and outcomes for children. The next step is to carry what was gained at the Institute into local team planning efforts that engage all Head Start and Early Head Start staff, parents, and community partners.

To support this work, the Head Start Bureau is developing a training strategy to sustain a continuous cycle of local program improvement. This includes promoting professional development; implementing an appropriate curriculum; implementing programming that supports optimal child outcomes; maintaining accountability for child outcomes based on sound curriculum implementation and appropriate child assessment; establishing a common understanding of, and commitment to, the new legislative mandates in child and family literacy; and enhancing the professional qualifications of staff, including associate and bachelor's degrees in early childhood education for teachers of children from birth to five years of age.

By the end of May 2001, multimedia educational materials based on the National Head Start Child Development Institute will be distributed to Head Start and Early Head Start grantees and delegate agencies. The package will include—

  • A set of six edited videotapes of the Institute faculty presentations
  • A companion guide to the videotapes that provides an introduction to each program segment; key speaking points of each presenter; some of the relevant Head Start Program Performance Standards and Child Outcomes applicable to each segment; bibliographical references and resources; and handouts distributed at the Institute
  • A copy of the Institute Implementation Planner

The Quality Improvement Centers and Quality Improvement Centers for Disability Services, along with the Regional Offices, will organize training events in each region to continue the work of the Institute. The multimedia materials will also be featured at the National Head Start Association's 28th Annual Training Conference, which will be held May 16-19, 2001, in Orlando, Florida.

Putting It to Work

Developing specific goals for program improvement is the responsibility of each local Head Start and Early Head Start program. The Head Start Bureau is committed to supporting local programs with information, materials, and technical assistance as they proceed to develop and implement their program improvement plans. It is our way of showing respect for each of you, and for the work that you do in meeting the changing needs of families and enhancing outcomes for children.

E. Dollie Wolverton is Chief of the Head Start Bureau's Education Services Branch; T: 202-205-8418; E: dwolverton@acf.dhhs.gov.

Celebrating 35 Years of Head Start

INSTITUTE GOALS

To increase the knowledge and leadership skills of local program managers in:

  • Supporting child development and learning in the domains of language development, literacy, mathematics, science, creative arts, social and emotional development, approaches to learning, and physical health and development;
  • Supporting school readiness and positive child outcomes in Early Head Start and Head Start through comprehensive child development services; age-appropriate, meaningful curriculum; child observation and assessment; and family involvement and partnerships; and
  • Enhancing the quality, intentionality, and effectiveness of staff interactions with children and families.

PROGRAM OVERVIEW

The Institute is a week-long learning experience for Head Start and Early Head Start managers with oversight for child development, education, and disabilities services through center-based, home-based, and family child care program options.

The Institute is grounded in the Head Start Program Performance Standards and the Head Start Child Outcomes Framework.

Institute participants received advance reading materials and assignments, including a guided review of local program services and quality in the five priority themes of the Institute. An Institute Implementation Planner guided participants in developing a vision and action plans to improve local program quality, management systems, and child outcomes.

Saturday

REGISTRATION AT HILTON WASHINGTON CONCOURSE
2:00 p.m. - 6:00 p.m.

Sunday

REGISTRATION AT HILTON WASHINGTON EXHIBIT HALL
10:00 a.m. - 10:00 p.m.

FACILITATORS' MEETING HILTON WASHINGTON, LINCOLN ROOM
3:00 p.m. - 5:00 p.m.

WELCOMING RECEPTION HILTON WASHINGTON, INTERNATIONAL BALLROOM
6:00 p.m. - 8:00 p.m.

Monday
9:00 a.m.

CURRICULUM AND ASSESSMENT

PLENARY SESSION

Welcome and Institute Overview
The Head Start Bureau

"School Readiness and Our Children"

Barbara Bowman
Erikson Institute

"Curriculum, On-going Assessment and Child Outcomes"

Sue Bredekamp
The Council for Professional Recognition

BREAK - 10:30 a.m.

PANEL "Curriculum: Birth to Five"

Diane Trister Dodge
The Creative Curriculum

Ann Epstein
The High/Scope Curriculum

Eileen Borgia
The Project Approach

LUNCH - 12:30 p.m.

PLENARY SESSION
"Screening and Child Assessment"

Samuel J. Meisels
University of Michigan

BREAK - 3:30 p.m.

PANEL "Assessment: Birth to Five" Jacqueline Jones Graduate School of Education
Harvard University

Larry Schweinhart
High/Scope Educational
Research Foundation

Edward de Avila
Linguametrics

DINNER BREAK - 5:30 p.m.

DIALOGUES WITH EXPERTS - 7:30 - 9:00 p.m.
An opportunity to meet today's plenary presenters and to ask questions

Tuesday
9:00 a.m.

SOCIAL AND EMOTIONAL DEVELOPMENT

PLENARY SESSION
"The Importance of Social and Emotional Attachment"

Tammy Mann
Zero to Three - Early Head Start
National Resource Center

Ron Lally
WestEd - Far West Lab

BREAK - 10:30 a.m.

"Establishing Environments in Which Children Can Succeed and Develop Positive Behaviors"

Mary Louise Hemmeter
University of Kentucky

Phil Strain
University of Colorado at Denver

LUNCH - 12:30 p.m.

SMALL GROUP DISCUSSIONS

Affinity Groups
Facilitated discussion among education leaders from different communities on interpreting and implementing the ideas, research, and effective practices offered in plenary presentations

BREAK - 3:30 p.m.

Leadership Team Planning
Work sessions for local teams of managers to develop a vision and action plans for program improvement using the Implementation Planner

DINNER BREAK - 5:30 p.m.

DIALOGUES WITH EXPERTS - 7:30 - 9:00 p.m.
An opportunity to meet today's plenary presenters and to ask questions

Wednesday
9:00 a.m.

MATHEMATICS AND SCIENCE

PLENARY SESSION

VIDEO

"Discoveries of Infancy: Cognitive Development and Learning"

Veronica Rodriguez
WestEd - Far West Lab

"Mathematics for Young Children"

Kathy Richardson
Mathematical Perspectives
Doug Clements
State University of New York at Buffalo

BREAK - 10:30 a.m.

"Science in the Early Childhood Years"

Carolyn Edwards
University of Nebraska

Karen Lind
University of Louisville

LUNCH - 12:30 p.m.

INTERACTIVE AFTERNOON
Performing Arts Stage
2:00 - 3:30
Hilton Washington
International Ballroom

2:00 - 5:30

  • Poster Sessions
  • Resources
  • Children's Art Gallery
  • Video Theaters

ICE CREAM BREAK - 3:30 p.m.

4:00 - 5:30

  • Showcasing Practices in Head Start and Early Head Start
  • Dialogues with Experts

CLOSING - 5:30 p.m.

Open evening for exploring Washington, D.C.

Thursday
9:00 a.m.

LANGUAGE DEVELOPMENT

PLENARY SESSION PANEL
"Language Development, Including English Language Learners"

Jerlean Daniel
University of Pittsburgh

Patton O. Tabors
Graduate School of Education
Harvard University

Kathy Escamilla
University of Colorado at Boulder

BREAK - 10:30 a.m.

PANEL

"Meeting the Needs of English Language Learners and Preserving Native Languages"


Nila Rinehart
Central Council of Tlingit & Haida, Alaska

Graciela Italiano-Thomas
Centro de la Familia de Utah

Maryann Cornish
Higher Horizons Head Start

LUNCH - 12:30 p.m.

SMALL GROUP DISCUSSIONS

Affinity Groups
Facilitated discussion among education leaders from different communities on interpreting and implementing the ideas, research, and effective practices offered in plenary presentations

BREAK - 3:30 p.m.

Leadership Team Planning
Work sessions for local teams of managers to develop a vision and action
plans for program improvement using the Implementation Planner

DINNER BREAK - 5:30 p.m.

DIALOGUES WITH EXPERTS - 7:30 - 9:00 p.m.
An opportunity to meet today's and tomorrow's plenary presenters and to ask questions

Friday
9:00 a.m.

LITERACY

PLENARY SESSION

"Fostering Early Literacy in Classrooms and Homes"

Dorothy Strickland
Rutgers University

Susan B. Neuman
Center for the Improvement of Early Reading Achievement
University of Michigan

BREAK - 10:30 a.m.

"Approaches to Effective Family Literacy"

Sharon Darling
National Center for Family Literacy

Gerie Cruz
Former Head Start Parent

CLOSING LUNCHEON - 1:00 p.m.

KEYNOTE ADDRESS AND RECOGNITION OF PARTICIPANTS

"Educational Leaders in Head Start and Early Head Start: A Privilege and Responsibility"

Maurice Sykes
Early Childhood Leadership Institute
University of the District of Columbia

ADJOURN - 4:00 p.m.


Resources

Available from the Head Start Information and Publications Center:

Observation and Recording: Tools for Decision Making. This technical guide enhances the skills of education staff so that they can accurately and objectively record young children's behavior and make appropriate decisions about program planning for each child.

Enhancing Children's Growth and Development. This technical guide expands on the concepts developed in Nurturing Children. It is designed to enhance the skills of education staff so that they can apply their knowledge of how children grow and develop to planning, implementing, and evaluating activities and experiences on-site, at home, and during group socialization sessions.

You may order publications from the Head Start Information and Publications Center by calling 703-683-5767, faxing 703-683-5769, or visiting their Web site at http://www.headstartinfo.org/.

Resources Available from Zero to Three: National Center for Infants, Toddlers and Families

"Developmental Screening, Assessment, and Evaluation: Key Elements for Individualizing Curricula in Early Head Start Programs," Technical Assistance Paper No. 4 (October 2000). This paper describes the differences between screening, assessment, and evaluation, and their relationship to curricula and planning. It also includes a list of resources, definitions of common terms, and a review of some common screening and assessment tests.

New Visions for the Developmental Assessment of Infants and Young Children, Samuel J. Meisels and Emily Fenichel, editors (1996). This publication discusses the principles and guidelines of appropriate developmental assessment, how parents and professionals share responsibility for the assessment process, the importance of sociocultural background, and a number of other valuable topics.

New Visions for Parents Materials

This family information packet about developmental assessment includes a letter to parents, New Visions: A Parent's Guide to Understanding Developmental Assessment, Planning and Preparing for Your Child's Developmental Assessment, and List of Terms: Terms Frequently Used in Developmental Assessment. Most of this information may also be downloaded from the Zero to Three Web site.

You may order publications from Zero to Three by calling 1-800-899-4301, faxing 202-638-0851, or visiting their Web site at www.zerotothree.org.

Selected Web sites with useful information include:

http://ceep.crc.uiuc.edu/eecearchive/books/fivepers.html - The entire text of Five Perspectives on Quality in Early Childhood Programs, by Lilian G. Katz, a book that is now out of print.

http://www.apa.org/ - The American Psychological Association. Search for "early childhood assessment."

http://www.nichcy.org/pubs/newsdig/nd23txt.htm - National Information Center for Children and Youth with Disabilities.

http://www.naeyc.org/ - National Association for the Education of Young Children.

http://eclkc.ohs.acf.hhs.gov - Early Childhood Learning and Knowledge Center.

http://zerotothree.org/ - Zero to Three: National Center for Infants, Toddlers and Families.

This is not intended to be a comprehensive list of online resources for Head Start programs.

Screening and Assessment in Head Start. Head Start Bulletin #70. DHHS/ACF/ACYF/HSB. 2001. English.

