Research and Evaluation Technical Support

SEDL provides research and evaluation technical support to assist policymakers and educators in understanding the impact of school improvement efforts. As with the technical assistance we provide in other content areas (e.g., literacy, math, STEM), we facilitate knowledge building in evaluation skills and tools. Our approach is to model evaluation and data-related knowledge and skills, scaffold clients as they begin to perform the activities themselves, and then monitor their performance and provide feedback and documentation until they reach the desired level of competency.

Services We Provide

SEDL provides technical support services to increase the capacity of educators and organizations to conduct research and evaluation. These services include assistance with the following:
  • Program selection criteria
  • Logic models/program descriptions
  • Evaluation and data collection plans
  • Performance management systems
  • Data collection strategies and systems
  • Data collection instruments
  • Statistical design methodology
  • Project management strategies
  • Data analysis, interpretation, and reporting
  • Policy/program implications & recommendations
  • Requests for proposals (RFPs)

Success Story

“Georgia educators are changing the way they work. With SEDL’s help, they can better measure a program’s effectiveness throughout the entire process and change course when necessary.” — Erin McCann, Project Director, Research and Evaluation, SEDL

Expanded State Capacity to Evaluate School Improvement Programs

Division of School Improvement, Georgia Department of Education
Kristy Kueber and Kathy Carrollton, program managers in the Division of School Improvement at the Georgia Department of Education, are believers in the power of evaluation to steer programs and keep them on track. The two worked with SEDL to learn our evaluation process for planning, guiding, and assessing school improvement programs. The team road-tested the process with the Georgia Thinking Maps™ initiative, which involved some 40 schools. By specifying up front exactly what implementation should look like, the process enabled the team to identify issues, make adjustments, assess the extent to which interim goals were being met, and track progress toward long-term goals. Kueber and Carrollton were so pleased with the results that they adopted the evaluation process for all of their division's programs.

Significant Work

Southeast Comprehensive Center (SECC): Research and evaluation (R&E) staff provide technical assistance to SECC state education agencies in Alabama, Georgia, Mississippi, North Carolina, and South Carolina to build their capacity to select and implement research-based school improvement programs and conduct evaluations of those programs. In addition, R&E staff assist education agency staff in developing their expertise in data-related skills and consult on data issues such as statistical design, data collection systems, and ensuring project cohesiveness. Work varies by state.

  • Alabama – Instructional Partners Program: R&E staff are assisting the Alabama Department of Education with formative evaluation, data collection, and data analysis for the Instructional Partners Program, including the interpretation and use of data to refine the program and evaluation plans during the scale-up of the initiative.
  • Mississippi – School Improvement: R&E staff are providing support to Mississippi School Improvement Grant (SIG) schools in sustaining effective practices. Support consists of developing reflective tools and co-facilitating workshops for district and school leaders.
  • South Carolina – School Improvement: R&E staff are assisting the South Carolina Department of Education (SCDE) in the use of an online system for monitoring and reporting implementation progress of school improvement plans. SCDE staff are also learning to use this system as a continuous improvement process for districts and schools. Support includes developing progress indicators, monitoring tools, and accountability plans, as well as providing professional development for SCDE staff on strategies for supporting districts and schools.
  • Multistate – Educator Evaluation Systems: R&E staff are assisting several SECC states in developing and implementing educator evaluation systems, with the end goal of elevating state capacity to identify, recruit, develop, and retain highly effective educators. Support ranges from lending technical expertise on complex evaluation metrics to developing communication strategies to engage stakeholders as active and informed participants.

Texas Comprehensive Center (TXCC): R&E staff provide technical assistance to the Texas Education Agency (TEA) and the state’s 20 regional education service centers through the TXCC to build their capacity to select research-based school improvement programs and conduct evaluations of the impact of those programs. R&E staff assist TEA and education service center staff in developing their expertise in data-related skills and consult on data issues such as statistical design, data collection systems, and ensuring project cohesiveness.

  • Texas Accountability System for Educator Preparation Programs (ASEP): To support TEA in developing the Texas ASEP, R&E staff are providing technical expertise on survey methodology, including survey development and validation, data management, analysis, interpretation, and use. R&E staff are also facilitating the involvement of staff from educator preparation programs in the development of ASEP metrics and serving as thought partners with TEA in efforts to streamline data collection and sharing.
  • Educator Evaluation System: R&E staff are working in an advisory capacity with TEA on educator evaluation metrics. Areas of focus include challenges related to observation practices, the methodological complexities and tradeoffs associated with implementing robust educator evaluation systems, and considerations for research designs and questions.

Cullman County School System: R&E staff are providing technical assistance to the Cullman County School System in using the Concerns-Based Adoption Model (CBAM) to facilitate a school improvement initiative. CBAM consists of a set of tools for formatively assessing implementation progress. These tools, when combined with other evaluation tools, will provide a comprehensive picture of the effectiveness and impact of the initiative. The SEDL team is collaborating with Cullman County School System staff to develop their expertise in combining formative and summative evaluation methods and tools to achieve their school improvement goals.

Girlstart Evaluation: SEDL is providing evaluation support to Girlstart, a Central Texas informal education program dedicated to empowering and equipping K–12 girls in science, technology, engineering, and math (STEM).

UTeach STEM Planning Grant: R&E staff are collaborating on a planning grant to develop research design options and methodological approaches to evaluate the efficacy of the UTeach STEM teacher preparation program at the University of Texas at Austin. R&E staff are recruiting national advisors with a range of expertise to advise and assist with the work.


Resources

Using Formative Assessment to Improve Student Achievement in the Core Content Areas: Southeast Comprehensive Center Briefing Paper, January 2012
Since 2001, federal laws such as the Elementary and Secondary Education Act (ESEA) and the Individuals with Disabilities Education Act (IDEA) 2004 have made raising student achievement standards the center of our national conversation. Consequently, educators have increasingly turned their attention to exploring the potential of formative assessment as one approach to improving student outcomes and meeting federal and state accountability requirements.

View all free resources and products for sale in the area of Research and Evaluation Technical Support.