Strategies for Success: Implementing a Comprehensive School Reform Program


Introduction
Distributing dollars to schools is an important part of supporting instruction and improving student learning. Financial resources pay teacher and administrator salaries, provide equipment and supplies for classrooms, fund professional development, and pay for buildings and their upkeep. Educators, as well as policymakers and researchers, want to know what level and mix of expenditures are most likely to produce high academic performance for all students. More than two decades of research studies address this issue of optimal resource allocation in education, but researchers have not solved the problem. One possible explanation as to why this issue continues to elude researchers is the nature and complexity of the problem. It is difficult to attribute an increase in student learning to any one factor because so many forces influence student learning, including factors outside the school environment. Additionally, an increase in expenditures may take years to result in higher student performance, at which time it becomes difficult to demonstrate a causal relationship between the resources and improved performance. Despite these difficulties, numerous researchers and educators believe this type of research should continue because it may lead to more effective resource allocation to achieve the goal of high academic performance for all students.

Many researchers have struggled to understand the precise relationship between education spending and student performance. Among recent studies that link education resources and student performance, two patterns emerge. One pattern shows a steady increase in federal, state, and local resources for education. The other pattern reveals generally weak increases in student achievement as measured by standardized tests. Researchers have failed to reach consensus about what these findings mean, and they have been hampered in their efforts by poor or inconsistent data sources as well as arguments about what constitutes appropriate research methodology. This study attempts to contribute new information to the dialogue about the relationship between education expenditures and student performance by using more recent data and multiple methods of analysis.

The purpose of this study is to gain a better understanding of the relationship between resource allocation in Texas public school districts and student academic performance. The study explores the relationship between academic performance levels and expenditures, including expenditures on certain types of educational programs. The study also examines the expenditure patterns of districts that improved performance over a three-year period. Researchers used Texas school finance data available on the Internet as one resource for studying state and district expenditure patterns. They requested data from the Public Education Information Management System (PEIMS) from the Texas Education Agency. Information about student performance in Texas was also accessed through the Internet. To gather information about the dynamics of school district resource allocation, researchers conducted interviews with administrators in twenty-one school districts.

Researchers used financial and accountability data for 1,042 school districts for three years: 1996-97, 1997-98, and 1998-99. Researchers created a subset of 774 target school districts using the state data set. The target districts had three years of finance data as well as three years of student performance data as measured by the Texas accountability system. In addition, researchers isolated twenty-one school districts, referred to as focus districts, for in-depth study. These districts had complete financial and performance data and agreed to participate in interviews about budgeting, salary and program costs, and financial incentives for improved student academic performance. Finally, researchers created a data set of financial and student performance data for nine districts with exemplary accountability ratings.

With these data sources, researchers sought answers to three questions. What is the current pattern of resource allocation among Texas school districts as measured by expenditures? Do Texas school districts with higher levels of student performance allocate resources in distinctive or unique ways? And, how do school administrators characterize their budget and resource allocation decisions, and do these characteristics differ between school districts that are high performing and those that are not?

Methodology

Researchers gathered data from the Texas Education Agency, from data bases available on the Internet and from interviews of public school officials familiar with school finance and resource allocation. The quantitative or financial data permitted researchers to conduct comparative studies of resource allocation and examine the relationship of spending to student academic performance. The qualitative or interview data provided a context for understanding how school districts make budget decisions to address instructional needs, and how districts seek to stimulate improved student academic performance using financial resources.

Researchers constructed five data sets for use in the analysis.

  • State-level data from PEIMS were used to describe the allocation of public school expenditures among functions and programs on a statewide basis.

  • Researchers identified 774 target districts that could be organized into three levels related to student performance: level three districts have the lowest relative performance, level two districts have the next highest performance, and level one districts have the highest overall performance. Once researchers grouped districts by performance level, they had information to compare resource allocations among the levels and to examine patterns or relationships between performance and resource allocation.

  • Researchers identified seven districts in each of the three levels created for the target district data set. These twenty-one districts are referred to as focus districts. Data on expenditure functions and programs for focus districts were analyzed to identify relationships between resource allocation and student academic performance.

  • Researchers identified nine school districts that moved from accountability ratings of acceptable in 1996-97 to exemplary in 1998-99. These districts are referred to as strong-improvement districts. Researchers explored expenditures in these districts and compared the findings to results obtained from the analysis of focus districts and target districts.

  • Researchers interviewed administrators in the twenty-one focus districts to learn more about general funding, salary costs, other staffing costs, professional development, special program costs, and financial awards related to improved student performance. Researchers transcribed interview data for use with software tools for qualitative analysis.

State-Level Data

The Texas Education Agency provided researchers with expenditure data from the Public Education Information Management System (PEIMS) for three years: 1996-97, 1997-98, and 1998-99. These data reflect actual (not budgeted) expenditures from all funds, reported by school districts at the end of the fiscal year. Researchers used PEIMS expenditure data organized by function and program intent.

Expenditure function. School expenditure functions are general categories of expenditure. For this study researchers selected nine function codes: instruction, instructional resources, staff development, instructional leadership, school leadership, guidance and counseling, social work, co-curricular and extracurricular activities, and general administration. All other function codes were aggregated to create a tenth function category labeled “other.” A description of the types of expenditures represented by the functions appears below, followed by a brief sketch of how expenditures can be grouped into these ten categories. Appendix B presents a more detailed description of the codes.

Function Codes:

  • Instruction—classroom teachers and teacher aides.

  • Instructional resources—librarians, library books, videos, software, resource center personnel.

  • Staff development—staff who prepare and/or conduct in-service training or staff development for instructional staff.

  • Instructional leadership—instructional supervisors, special population program coordinators or directors, and other educational program coordinators or directors.

  • School leadership—principals, assistant principals, and related staff.

  • Guidance and counseling—counselors and related staff; staff who research and evaluate the effectiveness of programs.

  • Social work—truant/attendance officers, social workers, personnel transferring migrant student records.

  • Co-curricular and extracurricular activities—salary supplements for coaches, athletic directors, athletic supplies and equipment, band uniforms, sponsors for UIL speech, debate, science competitions, etc.

  • General administration—salaries related to the superintendent, budgeting, and human resources; salaries associated with planning and research.

  • Other—transportation, facilities and plant maintenance, security and monitoring, community services, data processing, and other functions.
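As a rough illustration of this grouping, the Python sketch below rolls expenditure records into the nine selected function categories plus “other.” The column names and the specific PEIMS function-code values shown are assumptions for illustration only; the actual codes are described in Appendix B.

    # Illustrative sketch: summing expenditures into the nine selected
    # function categories plus an "other" bucket. Column names and the
    # PEIMS code values below are assumed for illustration.
    import pandas as pd

    SELECTED_FUNCTIONS = {
        "11": "instruction",
        "12": "instructional resources",
        "13": "staff development",
        "21": "instructional leadership",
        "23": "school leadership",
        "31": "guidance and counseling",
        "32": "social work",
        "36": "co-curricular and extracurricular activities",
        "41": "general administration",
    }

    def summarize_by_function(expenditures: pd.DataFrame) -> pd.Series:
        """Sum expenditure amounts by function category, treating unlisted codes as 'other'."""
        category = expenditures["function_code"].map(SELECTED_FUNCTIONS).fillna("other")
        return expenditures["amount"].groupby(category).sum()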

Program intent. School districts report expenditures according to the program or activity they are intended to support. Program intent expenditures show researchers how school districts plan to fund regular education, gifted and talented education, career and technology education, special education, compensatory education (sometimes referred to as “accelerated instruction”), bilingual education, and athletics and related co-curricular education. A description of program intent codes appears below.

Program Intent Codes:

  • Regular education—services directed toward basic, regular education students; includes honors and college preparatory courses.

  • Gifted and talented education—services directed towards students participating in a state-approved gifted and talented program.

  • Career and technology education—services directed towards students participating in a state-approved career and technology education course as an elective, as a participant in the district’s career and technology coherent sequence of courses program, or as a participant in the district’s tech prep program.

  • Special education—services directed towards students participating in special education programs.

  • Compensatory education—services directed towards increasing the amount and quality of instructional time for students in at-risk situations.

  • Bilingual education—services directed towards students participating in a state-approved bilingual education program, which is a full-time program of dual-language instruction.

  • Athletics/related education—costs for co-curricular/extracurricular activities.

Performance rating. Texas has an accountability system for rating school districts that incorporates information gathered from student attendance, dropout rates, and test scores. Tests used within the accountability system are called the Texas Assessment of Academic Skills (TAAS). TAAS tests in reading, writing, and mathematics are aligned with Texas learning standards that describe what students should know and be able to do. The state combines attendance rates, dropout rates, and TAAS performance in a rating system that produces a designation of exemplary, recognized, acceptable, or low performing for each Texas school district. Charts that summarize the standards for ratings appear in Appendix A of this report.

Researchers added a variable to each school district record to indicate the district’s accountability rating in 1996-97, 1997-98, and 1998-99. For this study districts rated exemplary were assigned the numeral 1, districts rated recognized were assigned the numeral 2, districts rated acceptable were assigned the numeral 3, and districts rated low performing were assigned the numeral 4. Many districts had ratings that varied from year to year. For example, a district might have a rating of 2 for 1996-97, and a rating of 1 for 1997-98 and 1998-99. Researchers were unable to assign every district a rating for each of the three years because of missing data or reporting problems.

Target District Data

Researchers examined district performance ratings for 1996-97, 1997-98, and 1998-99 and created a composite or average score for each district for which there were three years of accountability ratings. The composite score is the sum of the assigned numeric designations for the three years divided by three, that is, the average annual rating. For example, a district with a rating of 2 each year had a composite score of 2. A district with a rating of 2 in 1996-97 and 1997-98 and a rating of 1 in 1998-99 had a composite score of about 1.67 (5 divided by 3). Lower composite scores reflect higher student performance levels. A district with a composite score of 4 would have been low performing for all three school years, and a district with a composite score of 1 would have been exemplary for all three school years. Once the composite scores were constructed and assigned to each school district, researchers divided the districts into three levels based on the following criteria (a brief computational sketch appears below):

Level one districts are defined as those with

  • A composite score of 2.0 or lower

  • A 1998-99 accountability rating of recognized or exemplary

Level two districts are defined as those with

  • A composite score greater than 2.0 and less than or equal to 3.0

  • A 1998-99 accountability rating of acceptable

  • A 1998-99 TAAS passing rate for all students in the top 60 percent of districts with composite scores between 2.0 and 3.0 and with an accountability rating of acceptable

Level three districts are defined as those with

  • A composite score of 3.0 or greater

  • A 1998-99 accountability rating of acceptable or low performing

  • A 1998-99 TAAS passing rate for all students in the bottom 20 percent of districts with composite scores of 3.0 or greater and with an accountability rating of acceptable or low performing

Seven hundred and seventy-four (774) school districts met one of the three definitions. This group of 774 districts is called the target districts: 283 are at level one, 320 are at level two, and 171 are at level three. The purpose for identifying a level associated with student performance is to aid in the comparison of resource allocations and to assist researchers in determining whether student academic performance is related to resource allocation.
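The sketch below, written in Python with illustrative variable names, shows how the composite score and level assignment described above could be computed. The TAAS passing-rate screens for levels two and three are noted but not implemented in this sketch.

    # Illustrative sketch of the composite-score calculation and level assignment.
    RATING_TO_NUMERAL = {
        "exemplary": 1,
        "recognized": 2,
        "acceptable": 3,
        "low performing": 4,
    }

    def composite_score(ratings_by_year):
        """Average of the numeric rating designations across the three years."""
        numerals = [RATING_TO_NUMERAL[r] for r in ratings_by_year]
        return sum(numerals) / len(numerals)

    def assign_level(composite, rating_1999):
        """Return 1, 2, or 3 per the level definitions, or None if no definition fits.

        Levels two and three also require a TAAS passing-rate screen (top 60
        percent and bottom 20 percent of their respective groups), omitted here.
        """
        if composite <= 2.0 and rating_1999 in ("recognized", "exemplary"):
            return 1
        if 2.0 < composite <= 3.0 and rating_1999 == "acceptable":
            return 2
        if composite >= 3.0 and rating_1999 in ("acceptable", "low performing"):
            return 3
        return None

    # Worked example from the text: recognized (2), recognized (2), exemplary (1)
    # gives a composite of 5 / 3, or about 1.67, which falls in level one when the
    # 1998-99 rating is recognized or exemplary.
    print(composite_score(["recognized", "recognized", "exemplary"]))  # 1.666...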

Focus District Data

Researchers identified twenty-one districts for further study: seven districts from each of the three levels constructed for the target district data set. Selection was based on four criteria: campus performance, geographic distribution, student demographic profile, and willingness to participate in interviews. Researchers looked first to individual campus performance to select districts for further study. Within level one they selected districts that had at least one-third of their campuses rated recognized or exemplary. They selected level two districts that had 33 percent or fewer of their campuses rated recognized or exemplary. They selected level three districts that had no campuses with an exemplary rating. Once districts in each of the three levels were selected on the basis of campus performance, researchers chose districts to represent all Texas regions. When researchers completed the selection of districts according to campus performance and geography, they chose districts that represented a range of demographic profiles. Finally, they contacted the superintendent and business office of each selected district to secure permission to interview, with a goal of having twenty-one focus districts equally distributed among levels one, two, and three.
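As a rough illustration of the campus-performance screen, the following sketch applies the three criteria above to a list of campus rating labels for a district; the input structure is an assumption for illustration.

    # Illustrative sketch of the campus-performance screen used in focus-district selection.
    def passes_campus_screen(level, campus_ratings):
        """Apply the campus-performance criterion for the given target level."""
        high = sum(r in ("recognized", "exemplary") for r in campus_ratings)
        share = high / len(campus_ratings)
        if level == 1:
            return share >= 1 / 3                      # at least one-third recognized/exemplary
        if level == 2:
            return share <= 0.33                       # 33 percent or fewer
        if level == 3:
            return "exemplary" not in campus_ratings   # no exemplary campuses
        return False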

Strong-Improvement District Data

Researchers examined the state-level data set and identified school districts that had upward or positive changes in accountability ratings. They selected nine districts that had accountability ratings of acceptable in 1996-97 and exemplary ratings in 1998-99. These districts are referred to as strong-improvement districts.
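A minimal sketch of this selection, assuming a hypothetical mapping of school years to rating labels for each district:

    # Illustrative sketch: flag districts rated acceptable in 1996-97 and exemplary in 1998-99.
    def is_strong_improvement(ratings_by_year):
        """ratings_by_year maps a school-year string to that year's rating label."""
        return (ratings_by_year.get("1996-97") == "acceptable"
                and ratings_by_year.get("1998-99") == "exemplary")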

Interview Data

Researchers constructed interview questions covering six broad topic areas: general funding, salary costs, other staffing costs, professional development, special program costs, and fiscal awards related to student academic performance. Questions under these topic areas were divided into separate protocols that could guide interviews with finance officers, personnel directors, and superintendents of small districts. Researchers pilot-tested the interview questions with two school districts that were not included among the focus districts. Next, the questions were refined with the assistance of an educational finance consultant from the private sector. Appendix C presents the interview questions.

Researchers arranged to conduct interviews at the focus district sites. In three small districts only the superintendent was interviewed because the superintendent served as both the finance officer and personnel director. In the remaining eighteen districts, finance officers and personnel directors were both interviewed. All but three interviews were completed individually. Researchers conducted the remaining interviews using a group discussion format. Interviews took from 45 to 90 minutes to complete. Researchers taped interviews and later transcribed them for analysis with software appropriate for analyzing interview data.
