Dirty Numbers on School Safety
Schoolyard violence is always an emotionally hot topic, so it's understandable that state Education Department officials were cautious last month when they released the first statewide survey of school crime in nearly a decade. But being cautious is not the same as distorting reality.
Closer examination of the 1995-96 Safe Schools Assessment suggests it was so statistically flawed that the schools would have been better served had it simply been shelved. But political considerations, in the form of the Legislature, which specifically mandated the survey, forced the Education Department to go public with the findings, no matter how phony. And on their release, the department engaged in public relations maneuvers that simultaneously obscured the assessment's fundamental weakness and milked the phony results to political advantage. And in a final, cynical dollop, the press largely took the bait, reporting on the survey without ever questioning its merit.
Not that the media didn't have some considerable help from State Schools Superintendent Delaine Eastin. She tightly choreographed the publicity campaign surrounding the release of the findings. School district administrators were consulted beforehand, so they wouldn't be taken by surprise, and just as important, the press was handed a detailed script suggesting how to frame the coverage.
In the Bay Area, the bureaucrats and politicians were well-served with a flurry of neutral to positive stories about the safety of California's public schools. An added bonus for educators was this new bank of supposedly objective crime data, which they could then use to justify asking for more money for future anti-crime programs. And, somehow, Education Department officials managed to hedge the survey's figures so thoroughly that those numbers will be useless in measuring the effectiveness of any new programs down the road.
Eastin's strategy worked smoothly: Same-day stories ran in five large Bay Area dailies -- the Chron and the Ex in S.F., the West County Times and the Oakland Tribune in the East Bay, and the Marin Independent-Journal in the North Bay.
But that's where the survey's internal flaws began to manifest themselves, as different papers gave the numbers widely differing interpretations. Eastin and other officials in her agency only added to the confusion when they insisted this year's data was not intended to be interpreted. Yet as the story unfolded, Eastin herself violated this contradictory and confusing advice, and certainly never gave a satisfactory explanation for it.
Let's start with how the press responded. "Crime at School Costs Millions" was the Chron's headline; "S.F. schools get high marks in crime study" trumpeted the Ex on a defiantly parochial note; "Crimes in School Plummet" screamed a huge three-decker on the Trib's front page, with the kicker "Big 43% drop baffles Oakland authorities" (in actuality, it was a decline found in Oakland's separate, long-standing internal crime survey, which was released the same day); "Officials: School crime data faulty," demurred the West County Times (a member of the Contra Costa Newspapers chain) in a contrarian, yet ultimately accurate, interpretation; and, finally, the I-J's dry-as-dust, but perhaps closest to the truth: "Schools in State report 80,000 crimes committed."
Are we all reading off the same page here?
The reporters reacted to the numbers in the survey in precisely the way Eastin had urged them not to. They spun them to fit their markets and their assumptions. That's their job. But in chasing their own angles, reporters overlooked Eastin's remarkable attempt to downplay the validity of the findings, even as she tried to put a positive spin on them. And they failed to point out the long shadows of doubt that Eastin's behavior, and internal notes in the report itself, cast on the assessment.
"New data on school crime dispel the myth that our schools are unsafe" reads the opening of Eastin's press release. A direct quote follows, in which she says, "Our schools, in fact, are generally safer places for our students to be than their surrounding communities."
So much for not drawing conclusions from the numbers.
Next, bizarrely, Eastin cautions anyone else against doing what she has just done. And why? This year's survey numbers are so flawed as to be nearly meaningless. And why is that? Because school administrators failed to report all the crimes that occurred. They either simply chose not to do so, or didn't know how.
What Eastin's comments suggest is that the study could be irrelevant. But all five stories -- even the West County Times' skeptical account -- glossed over that inherent contradiction. After all, the circular thinking goes, if the assessment were so badly flawed as to mean nothing, the state wouldn't have released it, and there'd be no story. Pursuing the other alternative, the possibility that the state Education Department squandered $1.2 million on a hollow exercise merely to appease the Legislature, would have required a significant departure from Eastin's handy script.
The dailies proved faithful to Eastin's intended plot. Chron K-12 education reporter Nanette Asimov focused on the $22.7 million that property crimes supposedly cost schools last year. Asimov's second paragraph echoed Eastin's positive spin: The "new data" demonstrates that "public schools are not the hotbeds of violent activity many people perceive them to be." Asimov also swallowed without question the highly dubious stats in the state survey that suggested Oakland's schools were practically crime-free last year.
After quoting Eastin's press release extensively, the Ex's Larry Hatfield veered off-script and made a district-to-district comparison in order to take a whack at schoolchildren in the "well-to-do North Bay counties." Schools there reported much higher rates of drug and alcohol crimes, he noted, than their counterparts in S.F.
Give them a hot tub and a Jag and what do you expect?
Clearly, Hatfield and Asimov never consulted with West County Times staff writer Laura Bendix. In her account, state officials "admit" the assessment "is seriously flawed." And it "was released with a warning to the public: Don't jump to any conclusions."
Unfortunately, Bendix never pursued her inquiry to its logical end, i.e., that the assessment is meaningless. Instead, she ended by repeating without challenge Eastin's prepared statements to the effect that California schools are safe places.
Bendix had more incentive than the others to find fault with the study, because the data, such as it is, painted a dark picture of the public schools in the heart of her chain's circulation area. The San Ramon Valley, Mount Diablo, and Pleasanton school districts scored higher than their urban counterparts in a number of crime categories.
In one instance, the San Ramon district spokesman complained to Bendix that all of the Oakland schools, with nearly triple the number of students, reported a ridiculously low seven drug and alcohol incidents for the entire school year, vs. 62 for San Ramon. "If you believe that, I've got a bridge to sell you," he told Bendix.
The Trib positively chortled the good news about its public schools, which was no doubt welcome after the recent, protracted Ebonics flap. But then, reporter Jonathan Schorr had an added advantage. He was able to work off of the Oakland School District's own internal crime survey, which has been conducted for years, and which -- happily -- was released that same day. Oakland's own numbers showed a drop in crime, which is the source for the 43 percent figure in Schorr's story. But even Oakland officials are a little chagrined at the paper's glowing coverage.
"We never gave a number to it," said Sherri Willis of the Oakland schools' press office in an interview last week. "All we're saying is that the crime trend is down, except for property crimes." Willis noted that crime overall had dropped in Oakland, and the schools naturally reflected that trend.
Schorr said later that Oakland's internal survey warranted further investigation, especially the 43 percent decline it seemed to have found, "as if we had no crime at all," Schorr said.
As for the state's numbers, Schorr said: "They were worthless in terms of Oakland. I stayed away from them." Questions about the methodology aside, months at a time went by in which a number of schools filed no reports at all, he said. Those lapses would seriously undermine any conclusions that could be drawn from the state's figures on Oakland, and would give the lie to observations like Asimov's about Oakland schools being safer.
The bureaucrats in the state Education Department deserve much of the credit for this mess. Their instinct for self-preservation has overridden considerations of accuracy and clarity. For example, the executive summary of the assessment opens by giving a number of reasons that the survey was taken. Near the top is this one: "It will help eliminate unnecessary exaggerations about the safety of our schools brought on by sensational accounts of isolated negative events."
In a similar vein, the summary warns that crime might have been underreported because administrators failed to do their jobs correctly. When their performance improves, the numbers may go up, the report explains. In that fashion, officials have a ready-made excuse by which they will avoid being held accountable if crime figures rise in the next survey. They can dismiss the increase as a statistical fluke.
Still unanswered, though, is the question of whether the assessment's numbers have any significance at all. Asked point blank if the survey results were accurate, Safety Assessment Program Administrator Gail Evans said last week that they "are an accurate version of what was reported to us." There was no statistical way to correct for the faulty compliance rate, she said. And in this first year, the department didn't check often enough to even know how badly the assessment was mishandled.
Mark Duerr, whose company, Duerr Evaluation Resources, actually ran the survey, said Monday that "the data isn't perfect." He said this year was "a start-up." The findings were made public because "[t]he Legislature required us to report them."
While Evans resisted dismissing the '95-96 figures outright, she conceded that " '96-97 might be a better baseline." Then why release them? "This report was required by the Legislature this year."
And the next, and the next, and the next. Or, at any rate, until the time when the politicians and the bureaucrats determine the findings are too clear and too politically damaging, even when laundered for public consumption.