Connections, Vol. 1, No. 3, September 2000

Monitoring and Checking Progress: Finding and Addressing Problems Increases Success in Implementing School Reform

Problems and challenges are inevitable when implementing a comprehensive school reform program, no matter how well the program is planned. That is why the federal CSRD program includes evaluation as one of the nine components. By monitoring and checking progress throughout the implementation process, we’re able to identify problems, challenges, and concerns, and address them quickly. Catching problems early helps ensure a smoother, more successful implementation and can result in an improved program. Monitoring and checking progress can also serve as a source of encouragement to teachers—they will be reminded that changes are being made and that the school is progressing toward its vision of comprehensive school reform.

Checking progress can be accomplished in a variety of ways—formal and informal, qualitative and quantitative. Dennis Sparks, executive director of the National Staff Development Council (NSDC), explains how principals can incorporate different methods of monitoring and checking progress:

"Principals can check progress in a number of ways—some of them informal and some of them more formalized. They can be visible in the hallways and in the classrooms of the school by doing walk-throughs of classrooms or more extensive classroom visitations so that they have a sense of the challenges teachers face as they try to implement new strategies. They can be looking at student work with teachers to see if the quality of the work is changing as a result of the new approaches being used. They can look at data from across classrooms—formalized data that may be in the form of standardized tests or attendance information, for example."

He stresses, "Data are most useful when principals and teachers discuss it and make sense of it together. They should look at it as trend data so that they can go back several years and see what it used to be like and what it’s like today. Schools that are most successful, I have found, are schools that have had some training in data analysis and working together around that data. Because very often it’s quite difficult to understand what it’s about and what it means. So some training and lots of discussion among teachers with the principal is necessary to make sense of what it means and what it indicates the school needs to work on next to realize its vision."1

Like Sparks, Shirley Hord, SEDL program manager and researcher on school change, emphasizes the value of school leaders informally checking progress by consistently visiting classrooms and touching base with teachers. She says, "First of all, this lets teachers know that the administrators or leaders in the building are interested in what they are doing. And, second, it lets them know that this program they are trying to implement, the new work they are trying hard to do, is being appreciated and is a high priority for the school leadership."2


More Formal Evaluations

Although Shirley Hord does not usually include formal evaluation as part of the strategy "Monitoring and Checking Progress," we are going to include evaluation as part of our discussion here, as it is a critical component of CSRD that helps schools assess progress. Two kinds of evaluations are often referred to in discussions of comprehensive school reform—summative evaluations and formative evaluations. Many times schools think of evaluation only in terms of summative evaluation, which assesses overall project success and often incorporates state standards and benchmarks, standardized test scores, and statistics such as dropout rates and attendance rates. Summative evaluations tend to look at the degree to which the program has met specified goals and objectives. Formative, or process, evaluation is important too: because it focuses on ongoing project activities, it allows a school to make mid-course adjustments. With formative evaluation we check our progress toward expected outcomes by asking such questions as "What is working?" "What should be improved?" "How should it be changed?" Assessments such as surveys, interviews, observations, and checklists can be used to develop formative evaluations.3

SEDL vice-president and chief operating officer Joan Buttram refers to the formative evaluation as an early warning device. She explains the importance of having an evaluation plan in place early on: "You won’t reach your end results if things that were supposed to happen along the way didn’t happen. Uncovering problems as they arise and addressing the problems promptly can make or break your final results. The evaluation helps ensure that everything is being carried out as it should be."

[Photo: Dennis Sparks]

Jose Carrillo, principal of Martin Elementary School in Deming, NM, is an example of a visible principal. Carrillo often visits the classrooms and participates in class activities. Martin is one of 21 CSRD awardee schools in New Mexico and was one of the CSRD "step-ahead" schools that participated in the 1999 Improving America’s Schools conferences.

"All too often administrators feel they intuitively know what’s going on in the CSRD program, but they can be wrong," Buttram reports. "They often only talk to a certain group of people to get feedback or they only see a few parts of the program being implemented. A good evaluation plan can provide an overall view of how the program is being implemented."4

Some of the reform model developers include implementation checks as part of the technical assistance provided to schools. Margarita Calderón, who works with Success For All (SFA) schools around the country, strongly encourages implementation checks as part of a school’s CSR program. She says SFA representatives visit every classroom at each SFA school three times a year, using the same observation instrument. She reports, "We invite the principal, assistant principal, and curriculum specialist to come with us into these classrooms. After each visit, we debrief them so that we’re teaching them how to observe, what to observe, and how to organize the feedback that they will later give to the teacher. Implementation visits with feedback are probably the strongest element in ensuring that there is quality of implementation and that [the reform program] is impacting student academic achievement."

Planning for evaluation should be included as part of the planning process for your CSR program. Evaluation questions and objectives should be developed for the different components of the school’s comprehensive school reform program. For example, consider the professional development component. To assess this component of the program, a school might ask itself several questions, such as "Did all of the teachers participate in necessary training sessions?" "Were the people who conducted the sessions well trained and effective?" "How have the professional development activities created change in classroom practices and teacher effectiveness?" "Do teachers need additional training or coaching?"

To answer the first question, a school could look at the attendance records for the professional development sessions. If teachers who should have participated did not, further probing may be needed to determine why.

One way to answer the last two questions may be to observe teachers in action in their classrooms. Another way is to survey teachers. Sharron Havens, assistant superintendent for instruction in Lonoke, Arkansas, explains how her district does this either on paper or in small group discussions:

"We often have teachers fill out a survey maybe a month or two after the professional development and ask them, ‘How much are you implementing the ideas you heard in the workshop? To what degree are you implementing these? Have you implemented any of the ideas?’ Kind of a checklist where they can indicate the level at which they feel like they have implemented what they learned in training."

The district also queries teachers about why they are not implementing the strategies or skills the session focused on. Havens continues, "We ask them, ‘Do you need some additional training? What can we do to help you better understand the professional development or understand why you’re not implementing what you’ve heard?’"

Havens reports that it is often difficult to look at student achievement data and determine whether a professional development session has impacted instruction and learning because of the number of interventions or strategies that are being implemented at one time in a particular achievement area. "For example," says Havens, "the current goal with the school reform initiative in the elementary school is to focus on reading and that includes lots of different kinds of professional development related to reading. But it also includes some changes in reading practices. It includes changes in the focus of the school, making sure that kids are aware that reading is important. Even the signage in the school lets people know that we’re focusing on reading. So it’s very difficult to know that the reason for the increase in student achievement is related to a particular professional development session."5


External Assistance May Be Needed

Schools may feel overwhelmed trying to determine how to focus the evaluation and how to integrate the collection of evaluation data into existing procedures. Buttram stresses the importance of getting help from an outside consultant if needed. "Someone from the outside can give you a fresh perspective. Sometimes school staff are more willing to talk to an outside person who doesn’t have a stake in the program—they are more likely to be honest about what is or isn’t happening."6

CSRD schools in large districts may have an advantage when it comes time to develop an evaluation, as most large districts have evaluators on staff. For schools in small districts where such help isn’t available, Buttram suggests turning to area universities and colleges or to a regional education laboratory. When seeking help from universities, check with the departments of education, social services, and/or psychology; these are the departments that tend to have faculty with evaluation experience. Schools may find professors who are interested in consulting or who may be willing to supervise graduate students to assist in designing and conducting evaluations.

A final part of the evaluation is putting the findings to use. This means creating opportunities to discuss findings with staff and decide whether changes should be made. It means celebrating successes and learning from mistakes. It also means sharing findings with stakeholders outside of the school building—the superintendent, the school board, parents, and the community. Keeping stakeholders informed and interested can bolster support for your CSR program.

How Did Sierra Vista and Sunrise Measure Up?

One reason Sierra Vista’s reform program was thriving was that Ms. Martinez regularly led her staff in looking at student data. Sierra Vista teachers seemed to enjoy studying data and determining what progress their students were making. Ms. Martinez also demonstrated how important she considered the teachers’ work by visiting classrooms and following up with teachers regarding their instruction. She served as a valuable support system for her staff and set the tone for the entire school reform program.

On the other hand, Ms. Smith had not begun to promote the study of student data on a regular basis, nor did Sunrise teachers have organized discussions about informal indicators of school change, such as student attitudes or how certain students were struggling. Because of this lack of reflection, the staff not only missed seeing what adjustments should be made to their program, but also missed what may have been valuable indicators of progress in their school reform program—progress that the consultant from the model developer’s office saw easily. Ms. Smith had the right idea in trying to implement teacher portfolios, as she realized the need for her staff members to view their progress through a lens other than that of standardized test scores. However, as mentioned previously, the staff needed additional training to make the portfolio process successful.


For Discussion or Reflection

  • How do we obtain effective tools and processes to use in assessing our progress?

  • What types of data do we need to help us assess our progress?

  • What possible explanations might account for our findings? How can we use this information to improve our program?


Resources for Planning Your Evaluation

Two products developed by regional education laboratories can help you plan an evaluation for your CSRD program:

  • Evaluating for Success by Louis F. Cicchinelli and Zoe Barley, published by Mid-continent Research for Education and Learning (McREL), can be found online at http://www.mcrel.org/csrd/evalguide.pdf

    This is a good basic guide for developing the evaluation component of your CSRD program. It includes eight worksheets that can be used in planning an evaluation and provides examples of completed worksheets.

  • Developing Your School’s CSRD Evaluation Plan: An Awareness Workshop for Local Schools, published by the Northwest Regional Educational Laboratory (NWREL), may be ordered by calling Janice Wright at 1-800-547-6339.

    The overall goal of the Evaluation Awareness Workshop is to familiarize school practitioners with the benefits of a strong local evaluation of CSRD. The training package includes a sample agenda, scripts, overhead transparencies, participant handouts, and sources of further information.

1. Interview with Dennis Sparks, February 2, 2000.

2. Interview with Shirley Hord, January 10, 2000.

3. Collins, Patrick (ed.). Developing Your School’s CSRD Evaluation Plan: An Awareness Workshop for Local Schools. Portland, OR: Northwest Regional Educational Laboratory, 2000, pp. 12–13.

4. Interview with Joan Buttram, June 23, 2000.

5. Interview with Sharron Havens, January 21, 2000.

6. Interview with Joan Buttram, June 23, 2000.


© 2000 Southwest Educational Development Laboratory