SEDL INSIGHTS, Vol. 1, No. 2 (Summer 2013)
Managing the Implementation of School Improvement Efforts

by Jason LaTurner, PhD, and Dale Lewis, PhD
For a school improvement initiative to succeed, education leaders must do more than adopt a new program and train staff. This issue of SEDL Insights explores steps that leaders can take to ensure the successful implementation of a new program or practice.

SEDL Insights on Managing the Implementation of School Improvement Efforts

  1. Don’t just adopt a new program; implement it.
  2. Understand that change is personal.
  3. Define the change.
  4. Use data before, during, and after implementation.
  5. Commit for the long haul.

The push for college and career readiness for all students, educator evaluations tied to student growth, and the turnaround of our lowest-performing schools has resulted in a myriad of new programs and practices aimed at improving student achievement. Many of these efforts will fail to produce the desired results. This failure is not necessarily because the program or practice was inherently flawed—although there are plenty of programs with scant evidence of effectiveness—but because those charged with overseeing the improvement effort were unable to effectively manage the implementation process.

At SEDL, we have experienced the implementation of school improvement efforts from three unique perspectives: as those leading the effort, as those charged with implementing the new program or practice, and as consultants and evaluators for others who are managing the implementation. Based on our experiences and a review of the research on the topic, we have identified five key insights on managing implementation. Though focused primarily on leaders and other facilitators of change initiatives, these insights also provide guidance for anyone who has experienced the rollout of a new program or school improvement initiative.

Insight 1  

Don’t just adopt a new program; implement it.

Many educators have witnessed something like this when a new program is adopted: The district leadership team decides to provide tablet computers for all teachers, devoting significant time and resources to selecting the devices. They purchase the tablets and then hire a technology expert to provide a 1-day training session for teachers. Once the training session is over, teachers return to their classrooms, tablets in hand. The leadership team, busy with other responsibilities, assumes teachers are using the tablets.

A few months later, a curriculum specialist asks whether students are benefiting from the presence of the new technology in the classroom. Members of the leadership team realize that they don’t know if or how teachers are using the tablets. After some investigation, the leadership team learns that some teachers have not used the tablets since the training at the beginning of the school year. Others have taken the tablets home and are using them primarily for recordkeeping and administrative tasks. Others are letting students play games on the tablets as a reward for completing work or for good behavior.

The leadership team is disappointed. What went wrong? The team focused on adoption instead of implementation. Adoption of a new program and the corresponding training are the first steps in the longer process of implementation; they are important parts of that process, but on their own they do not ensure implementation.1 When we work with districts and schools that are initiating change, we ask them to think beyond “adopt and train” to how staff will actually use the initiative.

Insight 2  

Understand that change is personal.

For a new program to have the expected impact, leaders managing the effort must address the concerns of the people charged with implementing it. Staff may respond to an initiative in a variety of ways, from enthusiasm to stress. Those who are less comfortable with an innovation will express concern about how the innovation will affect them personally. Those who are more comfortable with and skilled in using an innovation will focus on broader impacts, such as how the initiative will affect their students or their working relationships with colleagues.2

In the same way that teachers monitor and respond to the needs of their students, leaders should assess and address the needs of their staff, guiding them in their professional growth. They can begin simply by checking in with the individuals charged with implementation to ensure that they not only understand the expectations but are also comfortable implementing something new. Through surveys and interviews, leaders can get a snapshot of staff concerns and use the data to determine what support to provide.3 For example, teachers who struggle to use formative assessment data during instruction may benefit from observing others who use such data in practice, or from coaching on how to make adjustments based on that feedback. In providing support, whether coaching, consulting, or follow-up actions such as small-group instruction, the leader also communicates encouragement and genuine concern for the individual or group and helps advance the change effort. This facilitative style builds a culture that is conducive to the change process and reflects the leader’s commitment to supporting others’ growth.
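
Turning survey responses into that snapshot can be mechanically simple. The sketch below, a minimal Python example, assumes each respondent’s dominant concern has already been coded into broad groupings such as self, task, and impact; the file name and column names are hypothetical. It tallies where concerns cluster so leaders can see where support is most needed.

```python
import csv
from collections import Counter

# Hypothetical input: one row per staff member, with each person's
# dominant concern already coded into a broad grouping.
#   respondent,dominant_concern
#   teacher_01,self      ("How will this change affect me?")
#   teacher_02,task      ("How do I manage the day-to-day logistics?")
#   teacher_03,impact    ("Is this actually helping my students?")

def summarize_concerns(path: str) -> Counter:
    """Tally dominant concerns across all survey respondents."""
    tally = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            tally[row["dominant_concern"]] += 1
    return tally

if __name__ == "__main__":
    counts = summarize_concerns("concerns_survey.csv")
    total = sum(counts.values())
    for concern, n in counts.most_common():
        print(f"{concern}: {n} staff ({n / total:.0%})")
```

A profile dominated by self concerns calls for reassurance and clear expectations; a shift toward task and impact concerns suggests staff are ready for more advanced support.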

For example, SEDL worked with a district that was implementing professional learning communities (PLCs) at several schools. As part of the PLCs, teachers first met in groups to plan a lesson that was aligned with state standards. They taught the lesson and then brought samples of student work to PLC meetings so the group could check for student understanding and identify ways to modify instruction to better meet student needs. School leaders surveyed staff concerns and realized that some of the teachers felt uncomfortable discussing student difficulties with their PLCs. These teachers were worried that colleagues might criticize them for their students’ difficulties or, worse, that the challenges would be noted in their performance reviews.

In response to these concerns, instructional coaches worked with each PLC to develop meeting guidelines that ensured a supportive and respectful environment. Coaches also modeled the behavior they wanted to see in the meetings: they recognized and celebrated successes, helped teachers use work samples to identify student difficulties and find solutions, and offered support rather than criticism. As teachers felt more comfortable in PLC meetings, they were more willing to ask colleagues for input when their students struggled to master a specific standard. Teachers appreciated the helpful feedback and felt that their instructional practices—and ultimately student achievement—benefited. By understanding and responding to the way teachers were experiencing the change, school leaders helped ensure that PLCs were successfully implemented.

Insight 3  

Define the change.

Even when they are enthusiastic about a new program, staff may return from training and realize that they still don’t understand what is expected of them. We encourage schools and districts to provide staff with a clear, specific, and shared description of what implementation of a new program or practice should look like. This description should look at implementation as a range of behaviors—including ideal, acceptable, and less desirable variations—rather than as implementing versus not implementing. Finally, it is helpful to define the components of the program or innovation. These may include materials used, teacher behaviors, and student activities.4

For example, a school implementing a new science program might want to define how teachers are expected to group students for learning. An example of ideal implementation might be the teacher assigning students to groups that vary over time based on instructional objectives and students’ abilities. Acceptable implementation might be the teacher assigning students to small permanent groups for lab work and other assignments. Less-than-desirable implementation might be the teacher exclusively providing whole group instruction.

One of the most valuable parts of defining the change is getting input from everyone involved in the program. A district in Alabama engaged in a collaborative, rather than top-down, process to define change and felt that the effort contributed to the program’s success. The district was implementing the state’s Strategic Teaching Framework, an approach that focuses on standards, lesson planning, and instructional strategies. The district had provided training on the framework, but there had been very little follow-up, and implementation varied among schools. In response, district leaders and teachers discussed and dissected the program, defining in greater detail what all stakeholders should be doing when they implemented the Strategic Teaching Framework in the classroom. The group then presented the description to school principals. The result of this collaboration was a clear and thorough description of the program that helped all staff understand expectations. Because stakeholders at all levels had helped define the change, they felt more ownership and became advocates for the framework. Finally, because they had a clear idea of how teachers should be using the framework, administrators charged with overseeing implementation said their classroom walk-throughs had greater focus and purpose.

Insight 4  

Use data before, during, and after implementation.

A popular adage among those who work in research and evaluation is “you can never have too much data.” Perhaps a more apt statement would be “you can never have too much relevant data”: data that will help leaders choose the appropriate program, determine how staff are implementing it, and, ultimately, assess what impact it is having on students. In other words, leaders must plan to collect data before, during, and after program implementation.5

Before a district or school adopts and implements a new program, leaders should collect and analyze data to (1) determine the extent to which there is a need for a new program and (2) set realistic goals for addressing that need. Consider student performance in mathematics as an example. Before a district selects a new math curriculum or professional development service, leadership should assess current student achievement. Is there even a need for the program? If so, is there a need for the program in all schools in the district or just some? Are there specific levels of student performance that the district would like to see as a result of implementing the program? The next set of pre-adoption data should focus on the programs under consideration. For example, is there research showing the program is effective? If so, is it effective for all students or only certain groups of students? Does the program have the potential to increase student performance to the desired level? This fine-tuned analysis of both student performance and available programs will better prepare decision makers to choose innovations that can best meet their students’ needs.6
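
To make this pre-adoption analysis concrete, here is a minimal sketch of the first step: computing each school’s proficiency rate from benchmark data and flagging schools that fall short of a district goal. The file layout, cut score, and target rate are all hypothetical placeholders.

```python
import csv
from collections import defaultdict

# Hypothetical input: one row per student with a math benchmark score.
#   school,student_id,math_score
PROFICIENT = 70      # assumed cut score for proficiency
TARGET_RATE = 0.80   # assumed district goal: 80% of students proficient

def proficiency_by_school(path: str) -> dict:
    """Compute the share of students at or above the cut score, per school."""
    scores = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            scores[row["school"]].append(float(row["math_score"]))
    return {
        school: sum(s >= PROFICIENT for s in vals) / len(vals)
        for school, vals in scores.items()
    }

if __name__ == "__main__":
    for school, rate in sorted(proficiency_by_school("math_benchmarks.csv").items()):
        status = "below goal; assess need" if rate < TARGET_RATE else "meeting goal"
        print(f"{school}: {rate:.0%} proficient ({status})")
```

A report like this answers the first two questions above: whether there is a need at all, and whether the need is districtwide or concentrated in particular schools.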

In an ideal setting, staff would implement a new program without any problems, and the new program would soon have a positive impact on student outcomes. But we know that this rarely, if ever, occurs. Collecting and analyzing formative data during implementation allows leadership to determine what is going well and what areas may need support.

For example, one of our evaluators worked for a district that had adopted a behavioral-support program through which educators aimed to replace negative student behaviors with more constructive behaviors, thereby reducing the need for punitive discipline. Behavioral-support specialists played a critical role in the program’s success through the training and support they provided to teachers and staff. The district team therefore wanted to ensure that the specialists were indeed providing the necessary support. The team reviewed service logs to track both the number of hours behavioral-support specialists spent working with teachers and staff and the resources they were providing. Through this collection and analysis of formative data, the leadership team saw that some specialists were not providing the expected level of support and materials. The team reached out to these specialists to see what prevented them from providing more support. In many cases, the specialists faced challenges such as scheduling difficulties, and district leadership worked with them to resolve these issues so that they could support teachers more effectively and ensure the program’s success.
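
The service-log review described above can be as simple as summing logged hours per specialist and flagging totals that fall below expectations. In this sketch, the log format, column names, and expected-hours threshold are hypothetical.

```python
import csv
from collections import defaultdict

# Hypothetical service-log format: one row per support session.
#   specialist,date,hours,resource_provided
EXPECTED_HOURS = 20.0  # assumed minimum support hours per review period

def hours_by_specialist(path: str) -> dict:
    """Total the support hours each specialist has logged."""
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["specialist"]] += float(row["hours"])
    return totals

if __name__ == "__main__":
    for specialist, hours in sorted(hours_by_specialist("service_logs.csv").items()):
        if hours < EXPECTED_HOURS:
            # A below-target total is a prompt for a conversation, not a
            # verdict: scheduling conflicts may be the real story.
            print(f"{specialist}: {hours:.1f} hours logged; follow up on barriers")
```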

Data collected after a program has been implemented for a set amount of time, or perhaps at the end of a school year, is typically used in a summative way to determine the program’s impact. With the math program described earlier, district leaders might examine student math scores on benchmark tests to determine whether the new program had the desired impact on student achievement in mathematics.
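
For the summative step, the analysis can start with something as simple as comparing mean benchmark scores before and after a year of implementation, as in the sketch below. The file and column names are hypothetical, and a real evaluation would also weigh factors such as changes in the student population.

```python
import csv
from statistics import mean

# Hypothetical file: one row per student with fall and spring
# benchmark scores from the year the new math program was used.
#   student_id,fall_score,spring_score

def average_scores(path: str) -> tuple[float, float]:
    """Return the mean fall and spring benchmark scores."""
    fall, spring = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            fall.append(float(row["fall_score"]))
            spring.append(float(row["spring_score"]))
    return mean(fall), mean(spring)

if __name__ == "__main__":
    fall_avg, spring_avg = average_scores("benchmark_scores.csv")
    print(f"Fall mean: {fall_avg:.1f}  Spring mean: {spring_avg:.1f}")
    print(f"Average gain: {spring_avg - fall_avg:+.1f} points")
```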

Too often, we see education leaders seeking quick fixes—and therefore abandoning a recently adopted program—because they are unhappy with the first round of post-implementation data. As noted earlier, when an initiative fails to have the desired impact, the program or practice itself is not always to blame. A thorough review of data may provide insights on how the program was implemented and highlight opportunities to support the staff engaged in the effort. When leadership teams reflect on a program’s success, we encourage them to consider data collected before, during, and after program implementation. Did the program truly meet the school or district’s needs? Did staff implement the program as expected? Finally, did students benefit? Data can also guide the leadership team in making decisions about the continuation of, or changes to, the program.

Insight 5  

Commit for the long haul.

The insights listed so far have reinforced the notion that leaders drive school improvement, especially in building professional community and teacher capacity.7 For example, we have seen a school struggle to implement a co-teaching initiative that grew out of grassroots concern about whether high school students with disabilities had access to rigorous, grade-level instruction. The initial excitement and momentum for this initiative faltered and eventually died out, in part because the principal showed little interest in it, chose not to take part in training with the team, and delegated responsibility for the effort to non-leadership personnel.

Conversely, change initiatives have succeeded as a result of a leader’s ongoing support and interest. We have seen PLCs flourish in a middle school where the principal participated in team meetings, used newsletters and faculty meetings to publicly acknowledge the efforts and progress of teacher teams, and collaborated with staff on an application that made the school a finalist for a national award recognizing collaborative professional learning. More recently, in our work with a geographically large, rural school district implementing Alabama’s Strategic Teaching Framework, we have seen central-office staff respond effectively to pushback from school leaders. When principals questioned the direction of the approach and expressed their need for more support, district leadership reviewed their concerns and identified problems with the initial launch of the project. The district then provided a professional learning opportunity to build common purpose and vision for the work ahead.

Conclusion

Even when educators are motivated to improve instruction and student achievement, implementing the changes required to produce these outcomes can be challenging both for education leaders and for those charged with implementation. People are often tempted to abandon the program at the first sign of failure. Managing the implementation of a school improvement initiative requires leaders to do more than adopt a new program and train staff. Education leaders will see better results if they think beyond these first steps and view implementation as a dynamic, long-term process. By considering how staff may experience the change, clearly defining how the initiative should look when implemented, collecting and analyzing data to measure success and provide support, and committing to support the initiative beyond adoption, leaders increase the chance that school improvement initiatives will have a positive impact on student achievement.

How SEDL Can Help

The insights described above are based on a framework called the Concerns-Based Adoption Model (CBAM), a resource that SEDL staff use to help schools and districts manage the implementation of school improvement initiatives. CBAM provides tools and techniques for leaders to identify staff concerns and analyze program implementation. This information empowers leaders to give each person the support they need to achieve success.

  • An Innovation Configuration Map provides a clear picture of what constitutes high-quality implementation. It serves as an exemplar to guide and focus staff efforts.
  • The Stages of Concern process, which includes a questionnaire, interview, and open-ended statements, enables leaders to identify staff members’ attitudes and beliefs toward a new program or initiative. With this knowledge, leaders can take actions to address individuals’ specific concerns.
  • The Levels of Use interview tool helps determine how well staff, both individually and collectively, are using a program. Levels range from nonuse to advanced use. When combined with the Innovation Configuration and first-hand observations, this information can help staff effectively implement a new program.

More information about CBAM is available on our website at sedl.org/cbam. If you would like to learn more about how the American Institutes for Research (AIR) can help you use CBAM to manage a school improvement initiative, visit the CBAM website on the AIR site.

Footnotes

1 Hord, Rutherford, Huling, & Hall, 2006.
2 George, Hall, & Stiegelbauer, 2006.
3 Hall & Hord, 2011; George et al., 2006.
4 Hall & Hord, 2011; Hord, Stiegelbauer, Hall, & George, 2006; W. K. Kellogg Foundation, 1998.
5 Hall, Dirksen, & George, 2006.
6 Learning Point Associates, 2004.
7 Bryk, Sebring, Allensworth, Luppescu, & Easton, 2010.

References

Bryk, A. S., Sebring, P. B., Allensworth, E., Luppescu, S., & Easton, J. Q. (2010). Organizing schools for improvement: Lessons from Chicago. Chicago, IL: The University of Chicago Press.

George, A. A., Hall, G. E., & Stiegelbauer, S. M. (2006). Measuring implementation in schools: The Stages of Concern Questionnaire. Austin, TX: SEDL.

Hall, G. E., Dirksen, D. J., & George, A. A. (2006). Measuring implementation in schools: Levels of use. Austin, TX: SEDL.

Hall, G. E., & Hord, S. M. (2011). Implementing change: Patterns, principles, and potholes (3rd ed.). Upper Saddle River, NJ: Pearson Education.

Hirsh, S. (2011). Foreword. In G. E. Hall & S. M. Hord, Implementing change: Patterns, principles, and potholes (3rd ed., pp. xix–xxi). Upper Saddle River, NJ: Pearson Education.

Hord, S. M., Rutherford, W. L., Huling, L., & Hall, G. E. (2006). Taking charge of change (Rev. ed.). Austin, TX: SEDL.

Hord, S. M., Stiegelbauer, S. M., Hall, G. E., & George, A. A. (2006). Measuring implementation in schools: Innovation configurations. Austin, TX: SEDL.

Learning Point Associates. (2004). Guide to using data in school improvement efforts: A compilation of knowledge from data retreats and data use at Learning Point Associates. Naperville, IL: Author.

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.

W. K. Kellogg Foundation. (1998). Using logic models to bring together planning, evaluation, and action: Logic model development guide. Battle Creek, MI: Author.


About SEDL Insights

SEDL Insights is based on the experience, expertise, and research of SEDL staff. It is designed to give education practitioners practical suggestions for implementing school improvement strategies.

About SEDL

SEDL is a nonprofit education research, development, and dissemination organization based in Austin, Texas. Improving teaching and learning has been at the heart of our work throughout our nearly 50 years of service. SEDL partners with educators, administrators, parents, and policymakers to conduct research and development projects that result in strategies and resources to improve teaching and learning. SEDL also helps partners and clients bridge the gap between research and practice with professional development, technical assistance, and information services tailored to meet their needs.

Copyright 2013 by SEDL
All photos are copyright Thinkstock.
You are welcome to reproduce issues of SEDL Insights and distribute
copies at no cost to recipients. Please credit SEDL as publisher.
SEDL Insights
Authors: Jason LaTurner, PhD, and Dale Lewis, PhD
Editor: Laura Shankland, MA, PMP
Designer: Shaila Abdullah
