Stories from the Field
Colorado 21st Century Community Learning Centers
Even the process of developing evaluation tools can involve monitoring your work and using results for continuous improvement. Just ask the team that developed the evaluation tools for the Colorado 21st Century Community Learning Centers (21st CCLCs). The evaluation includes progress reports that determine continued funding; the 21st CCLC Profile and Performance Information Collection System, which all 21st CCLC grantees are required to complete; a quality improvement/monitoring tool; and focus groups.
Launched this year, the quality improvement/monitoring tool is the newest component of the evaluation structure. In developing it, the team sought input from afterschool staff, program evaluators, afterschool advisors, and outside experts. While working on the various drafts of the tool, evaluation leaders also presented it to grantees, giving them the opportunity to provide feedback and express any concerns. Finally, they pilot-tested the tool at three sites. According to Joy Fitzgerald, an external evaluator who helped develop the tool, “The feedback of those who participated in the pilot monitoring visits was invaluable in helping us fine-tune both the format of the . . . tool and the processes for its use by programs and monitoring teams.”
Although sites use this tool to prepare for external evaluations, it also helps them create high-quality programs. Leaders and key staff can use it to assess, plan, design, and implement strategies for ongoing program improvement. The tool includes a worksheet on which afterschool leaders can note strengths and priorities for improvement. “This plan provides a structure to help grantees consider how improvement priorities will be enacted—through what activities, by whom, using what resources, and on what timeline. In addition to promoting quality improvement, the self-assessment process provides program partners and collaborators with a common structure for comparing their perceptions and identifying concerns as they work together,” says Fitzgerald.
As they introduce the new evaluation tool and begin site visits in April, the evaluation leaders will also model evaluation and continuous improvement for the sites they visit. They intend to collect feedback from users, record their own experiences, and follow emerging research on evaluation, then use this information to improve the quality improvement/monitoring tool they have developed.