Playing it Safe with School Safety Programs

Published in SEDL Letter Volume VIII, Number 2, August 1995, School Safety

Some school safety programs require additional investigation before implementation. These strategies may help educators choose wisely.

A quick survey of the recent literature reveals that school safety is a subject of great interest to educators. Several important publications on school safety have appeared, including the Harvard Educational Review's special issue on youth and violence, Phi Delta Kappan's "Special Report on Youth Violence," and Educational Leadership's "Contemporary Issues: Violence in School."

Educators are not only reading about school safety, they are adopting school safety programs. One indication of the adoption boom is the number of schools implementing conflict resolution programs with peer mediation components. According to the National Association for Mediation in Education, an estimated 5,000 peer mediation programs are in schools today, compared to 100 five years ago.

Educators bringing safety programs into their schools can benefit from applying strategies to assess the value of these curricula before adopting them. Though knowledgeable in their disciplines, many educators may need additional resources to review and choose judiciously from among the highly specialized school safety and violence prevention programs.

The truth is, studies reveal that some violence prevention programs promise outcomes that are unverified by research. "In fact," writes Marc Posner in The Harvard Education Letter, "researchers are beginning to question whether the most commonly used school-based programs for violence prevention and conflict resolution actually do what they are supposed to do."

In their review, What Works in Reducing Adolescent Violence, Patrick Tolan and Nancy Guerra of the University of Illinois-Chicago go one step further. They write:

Programs that train peers to serve as mediators of disputes and train youth in conflict resolution skills have become increasingly popular since the mid-1980s....However, despite the soaring popularity of this type of intervention at the elementary school, middle school, and high school levels, and a number of laudatory "testimonials" from teachers and other participants,...we could not locate a well-designed empirical study that evaluated behavioral outcomes with adolescents. Although peer mediation has an intuitive appeal, particularly in terms of reducing situational and interpersonal violence, its efficacy has simply not been determined (p. 34).

Yet school administrators can follow guidelines based on common sense when selecting school safety programs. "Administrators should act as smart consumers," advises Jack Lumbley, a SEDL evaluation associate who has assessed school programs for more than 20 years. Lumbley recommends that educators apply some of the strategies they use when weighing other purchases and call upon the expertise of others while assessing such programs.

Consider the source of the program, Lumbley says. Traditional publishers typically subject their materials to a rigorous testing sequence and release them only after appropriate outcomes have been demonstrated reliably. In contrast, some antiviolence programs have been created by organizations specializing in conflict resolution rather than in curriculum design. These organizations, many of which are nonprofit, may lack sufficient resources to subject their curricula to structured field testing by the public school students and personnel who will ultimately use them. And, Lumbley notes, programs released without proper preparation may not deliver the results one might expect.

Dan Kmitta agrees. He is a researcher investigating how schools are implementing conflict resolution programs and how best to measure the effectiveness of such programs. Kmitta is completing a three-year study of conflict resolution programs in 14 Cincinnati, Ohio schools.

If the program producer is a nonprofit organization specializing in conflict resolution strategies, program evaluation is "generally done in-house" when it is done at all, Kmitta says. Such an internal evaluation is bound to be less stringent than tests conducted by objective third-party observers; unintentional bias may skew results.

Some developers of conflict resolution programs counteract this potential problem by incorporating into their programs a process evaluation component in which participants rate the content. "That produces valuable information for practitioners," Kmitta says. "It does not give those of us in the field who are looking at a program much information about its efficacy."

Information about program effectiveness may also appear in training manuals and other support literature. Even so, administrators should contact the program developer and ask how and where the program was evaluated, Lumbley says. Find out how extensively the program has been tested and ask for written evidence supporting the purported outcomes.

Questions one could ask when interviewing a program developer include:

  • Did the evaluator have sufficient expertise in the subject area? An expert on peer mediation techniques, for instance, might not be a suitable reviewer of a program that teaches bias identification.

  • Were the tests actually conducted in a school or in a test environment that simulated a school?

  • If the program was tested in a school, what was the school like? Were conditions comparable to those in the administrator's own school? Program outcomes at a suburban school may be very different from those at a rural school or those at an inner-city school.

  • Who presented the program? The choice of trainers and their knowledge of program materials can affect program outcomes dramatically. "The same program targeting similar audiences may have entirely different outcomes when presented by different implementers," Lumbley says.

Finally, check if the product's marketing literature contains published evidence of its effectiveness, Lumbley says. Good marketing literature should clarify the purpose of the program while providing evidence that the promised outcomes have been achieved.

Lumbley identifies another pitfall in program selection. Administrators might assume that a program on a "recommended curriculum list," particularly one endorsed by a state board of education, has already been rigorously reviewed. That assumption may not hold, and the same pitfall attends any blanket recommendation: a program that works in one school may fail or create problems in another.

The program should be designed to serve the specific student population in the school. Crediting Johns Hopkins University researcher Daniel Webster with the insight, Posner writes in The Harvard Education Letter, "...many conflict resolution programs teach the kinds of negotiation skills that may be useful for middle-class students whose disputes stem from competing interests, but not for poor, high-risk youth for whom violent conflict is often the result of macho posturing and competition for status."

"For this reason," says Lumbley, "administrators selecting programs ought to ask if they have been implemented elsewhere, and then should follow up on those referrals. Determine the similarities and differences between the referral schools and one's own. Ask how well students in the referral schools have accepted the program, what the program has done for the schools, and what difficulties occurred during program implementation.

"And," Lumbley advises, "go beyond the schools mentioned in the referrals to the local courts, the juvenile justice system, and social service agencies. Ask the professionals in these institutions for their perspectives on the programs. Specialists in state education agencies who track curricula may also be a source of reliable program appraisals."

School violence is systemic and stretches beyond the schoolyard. That's all the more reason to contact other institutions, Lumbley says. With such alliances and a proven, appropriate program, educators can better ensure that students are safe.

For more information, and to order copies of the sources cited in this article, contact the publishers.

Posner, Marc (1994, May/June). "Research Raises Troubling Questions about Violence Prevention Programs." The Harvard Education Letter, 12, 3, 1-5. Call 617/495-3432.

Tolan, Patrick and Guerra, Nancy (1994). What Works in Reducing Adolescent Violence: An Empirical Review of the Field. Boulder, CO: Center for the Study and Prevention of Violence, University of Colorado, 1-97. Call 303/492-1032.
