The National Center for Quality Afterschool

Helping local practitioners and state education agencies develop high-quality and balanced programs


From Compliance to Quality: How to Make Evaluation Work for Your Program

Evaluation. If you work in afterschool, you probably hear the word often. Many afterschool programs are required to evaluate their work or bring in an external evaluator to review the program and determine whether it is in compliance with the terms of the grant.
Although it’s a start, simply completing an evaluation does not ensure a high-quality program. As we at the National Partnership for Quality Afterschool Learning continue our research and training, we urge afterschool professionals to move beyond using an evaluation to show compliance and instead use it to build high-quality, sustainable afterschool programs. High-quality programs have clear program goals, undergo regular evaluations to determine if they are meeting those goals, and reset their goals based on these evaluations.

Program evaluation is too broad a topic to cover in one newsletter (we’ll cover other aspects of it in April), but we hope to steer you toward some guidelines, resources, and examples that will change your view of program evaluation from a report completed every year to secure funding into a powerful tool that sets your program on a course of continuous improvement. The steps below will help ensure that you get the most out of a program evaluation.1

Plan early. You probably know what your program goals are, whether they include character development or improved math skills. Decide early what data you will need to determine if you are reaching those goals. “Evaluation planning comes at the beginning, not the end,” explains Priscilla Little of the Harvard Family Research Project, an organization that promotes educational success and includes evaluation as one of its main areas of research. “It’s not about sitting down at the end and trying to determine what you’ve accomplished.”2

Decide what you want to know about your program. If you are required to complete an evaluation or report, identify the information that is most useful for program improvement. This should be only the starting point, however. Ask yourself what else you need to evaluate to have a high-quality program. If you are trying to assess and improve relationships with students’ families and your evaluation is limited to data like students’ standardized test scores and grades, you’re not going to get the information you need. Parent surveys and reports on family participation would be better sources of data.

Get all of the stakeholders involved. It may be tempting to limit involvement of stakeholders to the people with money (i.e., grant administrators or potential funders), but the list of people who should be included in planning an evaluation is much longer. Stakeholders can include parents, youth, afterschool and regular day-school staff, and community members, to name a few.

Little encourages afterschool leaders to allocate time and money for staff input during the evaluation planning process. This includes letting staff voice any concerns and adding questions to the evaluation that reflect their interests in the program.

Use what you learn. Celebrate and share any positive results in your evaluation, and then roll up your sleeves and decide how you want to use the information. Again, involve stakeholders and decide what changes you want to make as a result of your evaluation. The leaders of one afterschool program noticed that attendance rates at one site were significantly lower in the fall. A few inquiries revealed that the drop occurred because the program’s hours conflicted with those of fall sports, and students were opting for the latter. The leaders addressed the situation by adjusting the program’s hours and collaborating with some of the athletic programs to include them as afterschool activities. Another program we talked to determined from the staff surveys in its evaluation that it needed to provide more staff development opportunities. Yet another program realized that it needed to get more input from parents if it was going to increase family involvement.

When we showcase afterschool programs, we like to point out the student learning that occurs. Applying what you learn from a program evaluation is a way to show the learning and improvement that occurs among your program leaders as well.

1 Huang, D. (2006, May). Preliminary findings from promising practices site identification for the 21st Century Community Learning Centers. SEDL Letter, 18(1), 9–14.

2 Little, P., DuPree, S., & Deich, S. (2002, September). Documenting progress and demonstrating results: Evaluating local out-of-school time programs. Issues and opportunities in out-of-school time evaluation series. Cambridge, MA: President and Fellows of Harvard College and The Finance Project. For more information about the Harvard Family Research Project, see http://www.hfrp.org/.

Case Study

Colorado 21st Century Community Learning Centers

Even the process of developing evaluation tools can involve monitoring your work and using results for continuous improvement. Just ask the team that developed the evaluation tools for the Colorado 21st Century Community Learning Centers (21st CCLCs). The evaluation includes progress reports to determine continued funding, the 21st CCLC Profile and Performance Information Collection System that all 21st CCLC grantees are required to complete, a quality improvement/monitoring tool, and focus groups.

Launched this year, the quality improvement/monitoring tool is the newest component of the evaluation structure. In developing the tool, the team sought input from afterschool staff, program evaluators, afterschool advisors, and outside experts. While they were working on the various drafts of the tool, evaluation leaders also presented it to grantees, giving them the opportunity to provide feedback and express any concerns. Finally, they pilot-tested the tool at three sites. According to Joy Fitzgerald, an external evaluator who helped develop the tool, “The feedback of those who participated in the pilot monitoring visits was invaluable in helping us fine-tune both the format of the . . . tool and the processes for its use by programs and monitoring teams.”

Although sites use this tool to prepare for external evaluations, it also helps them create high-quality programs. Leaders and key staff can use it to assess, plan, design, and implement strategies for ongoing program improvement. The tool includes a worksheet on which afterschool leaders can note strengths and priorities for improvement. “This plan provides a structure to help grantees consider how improvement priorities will be enacted—through what activities, by whom, using what resources, and on what timeline. In addition to promoting quality improvement, the self-assessment process provides program partners and collaborators with a common structure for comparing their perceptions and identifying concerns as they work together,” says Fitzgerald.

As they introduce the new evaluation tool and begin site visits in April, the evaluation leaders will also model evaluation and continuous improvement for the sites they visit. They intend to collect feedback from users, take notes on their own experiences, and study emerging research on evaluation, then use this information to improve the quality improvement/monitoring tool they have developed.

http://www.cde.state.co.us/cdecomp/21stCentury.htm

Read this story in the March 2007 issue of AfterWords.

Online Training for Afterschool Staff
The Afterschool Training Toolkit is available online free of charge.

The following resources can be used with the online Afterschool Training Toolkit to help you build fun, innovative, and academically enriching afterschool activities.