Unspun

After quoting Eastin's press release extensively, the Ex's Larry Hatfield veered off-script and made a district-to-district comparison in order to take a whack at schoolchildren in the "well-to-do North Bay counties." Schools there reported much higher rates of drug and alcohol crimes, he noted, than their counterparts in S.F.

Give them a hot tub and a Jag and what do you expect?

Clearly, Hatfield and Asimov never consulted with West County Times staff writer Laura Bendix. In her account, state officials "admit" the assessment "is seriously flawed." And it "was released with a warning to the public: Don't jump to any conclusions."

Unfortunately, Bendix never pursued her inquiry to its logical end, i.e., that the assessment is meaningless. Instead, she ended by repeating without challenge Eastin's prepared statements to the effect that California schools are safe places.

Bendix had more incentive than the others to find fault with the study, because the data, such as it is, painted a dark picture of the public schools in the heart of her chain's circulation area. The San Ramon Valley, Mount Diablo, and Pleasanton school districts scored higher than their urban counterparts in a number of crime categories.

In one instance, the San Ramon district spokesman complained to Bendix that all of the Oakland schools, with nearly triple the number of students, reported a ridiculously low seven drug and alcohol incidents for the entire school year, vs. 62 for San Ramon. "If you believe that, I've got a bridge to sell you," he told Bendix.

The Trib positively chortled the good news about its public schools, which was no doubt welcome after the recent, protracted Ebonics flap. But then, reporter Jonathan Schorr had an added advantage. He was able to work off of the Oakland School District's own internal crime survey, which has been conducted for years, and which -- happily -- was released that same day. Oakland's own numbers showed a drop in crime, which is the source for the 43 percent figure in Schorr's story. But even Oakland officials are a little chagrined at the paper's glowing coverage.

"We never gave a number to it," said Sherri Willis of the Oakland schools' press office in an interview last week. "All we're saying is that the crime trend is down, except for property crimes." Willis noted that crime overall had dropped in Oakland, and the schools naturally reflected that occurrence.

Schorr said later that Oakland's internal survey warranted further investigation, especially the 43 percent decline it seemed to have found, "as if we had no crime at all."

As for the state's numbers, Schorr said: "They were worthless in terms of Oakland. I stayed away from them." Questions about the methodology aside, a number of schools went months at a time without filing any reports, he said. Those lapses would seriously undermine any conclusions drawn from the state's figures on Oakland, giving the lie to observations like Asimov's that Oakland schools are safer.

The bureaucrats in the state Education Department deserve much of the credit for this mess. Their instinct for self-preservation has overridden considerations of accuracy and clarity. For example, the executive summary of the assessment opens by giving a number of reasons that the survey was taken. Near the top is this one: "It will help eliminate unnecessary exaggerations about the safety of our schools brought on by sensational accounts of isolated negative events."

In a similar vein, the summary warns that crime might have been underreported because administrators failed to do their jobs correctly. When their performance improves, the numbers may go up, the report explains. That gives officials a ready-made excuse if crime figures rise in the next survey: They can write off the increase as an artifact of better reporting rather than a real jump in crime.

Still unanswered, though, is the question of whether the assessment's numbers have any significance at all. Asked point blank if the survey results were accurate, Safety Assessment Program Administrator Gail Evans said last week that they "are an accurate version of what was reported to us." There was no statistical way to correct for the faulty compliance rate, she said. And in this first year, the department didn't check often enough to even know how badly the assessment was mishandled.

Mark Duerr, whose company, Duerr Evaluation Resources, actually ran the survey, said Monday that "the data isn't perfect." He said this year was "a start-up." The findings were made public because "[t]he Legislature required us to report them."

While Evans resisted dismissing the '95-96 figures outright, she conceded that " '96-97 might be a better baseline." Then why release them? "This report was required by the Legislature this year."

And the next, and the next, and the next. Or, at any rate, until the politicians and the bureaucrats decide the findings are too clear and too politically damaging, even when laundered for public consumption.

Phyllis Orrick can be reached at SF Weekly, Attn: Unspun, 425 Brannan, San Francisco, CA 94107; phone: (415) 536-8139; e-mail: porrick@sfweekly.com.
