"Did anybody do a technical review of your work?"
"Yes," Bavarian replied.
"My colleagues and my friends."
"Your Honor, I think this is misleading," he said somberly. "I really do."
If it seems odd that a relatively untested science, explained by a relatively unknown scientist, should be considered as evidence by a panel of men and women deciding whether to send a young man to prison for the rest of his life — well, it is.
State and federal courts ostensibly have rigorous tests for establishing the admissibility of scientific evidence. The goal, as a California appellate judge wrote in 1998, is to avoid bestowing "a misleading aura of certainty or a posture of mystic infallibility" on quackery.
"The traditional fear is that the jury will much more readily accept something that an expert says is true," said Michael Saks, a professor at Arizona State University College of Law and an expert on scientific evidence. "The official doctrine is, you want to move slowly and carefully and not let something in until it's really good."
Yet this doctrine has been more honored in the breach than in the observance, according to Saks. Many novel brands of forensic science have been heedlessly allowed into high-stakes trials, and withdrawn from courtroom use only when later called into question. Forms of "scientific" evidence that have been accepted and then debunked include voice-print identification, bullet-lead analysis, blood-spatter studies, handwriting analysis, bite-mark comparison, and certain burn patterns that were once thought to indicate arson.
Strange to say, in the 19th century, before forensic fingerprinting became established, a primitive form of biometrics that measured people's heads and limbs — known as Bertillonage after its creator, a French police clerk named Alphonse Bertillon — was the dominant means of identifying suspects in the criminal-justice system.
"We have a long and somewhat ambivalent tradition of using biometrics for legal identity," said Jennifer Mnookin, a professor at UCLA School of Law and an expert on forensics and scientific evidence. "The question is, how well does it work? And how well does it work from a blurry photograph?"
There's a simple answer to that question, according to some biometrics authorities: not well. Whatever Bavarian's friends may have thought about his work on the Charles Heard case, it turns out that some of his colleagues have reservations about it.
Academic experts in facial-recognition techniques interviewed by SF Weekly, along with an FBI forensics specialist who testified on behalf of the prosecution in Heard's trial, expressed skepticism both about facial recognition's readiness for the courtroom and the specific methods used by Bavarian.
"Any biometrics system can make a mistake," said Anil Jain, a Michigan State University professor of computer science and engineering and a biometrics expert. "In the case of face, the accuracy is worse, because face changes with respect to illumination, with respect to aging. ... I think face recognition, particularly for surveillance applications, is not quite ready for automated identification."
Jain also raised questions about Bavarian's comparison of the images of Heard and the video of the supposed shooter. After a phone interview with SF Weekly, he reviewed the "Biometric Analysis" submitted to the court by Bavarian and, in a subsequent e-mail, highlighted what he saw as several weaknesses in the report.
In Heard's case, Jain noted, "It is not surprising that they have an inconclusive result given the quality of video. ... The problem of comparing a high-resolution image with the low-resolution video is challenging." Additionally, he noted that Bavarian's decision to perform a comparison of the faces in the images by hand, rather than using a computer program, "introduces a lot of subjectivity."
Asked about these criticisms, Bavarian said he had performed the analysis manually because the low-quality images from the surveillance tapes could not be accurately fed into automated biometrics software. He acknowledged a risk of subjectivity in his method, but said it had been minimized — and the poor quality of the images corrected for — by comparing Heard's jail photo to dozens of still frames of the shooter gleaned from the video footage, rather than just one or two.
Perhaps the most damning review of Bavarian's work came from the expert witness Swart put on the stand to debunk his analysis.
Richard Vorder Bruegge is a forensic scientist and photographic technologist employed by the FBI. He has a Ph.D. in geological sciences from Brown University, a full head of silver hair, and youthful good looks, presenting a stark contrast to Bavarian's vaguely reptilian allure.
The authoritativeness of Vorder Bruegge's résumé was almost comical. He worked with NASA on the Clementine mission, which carried out the first image mapping of the moon in 1994; he is the chairman of the Facial Identification Scientific Working Group — the same group Bavarian had claimed was "not relevant"; and he came to San Francisco for Heard's trial just two weeks after delivering the keynote address on facial identification at a biometrics conference in Australia.
On the stand, Vorder Bruegge argued that the technique used by Bavarian — measuring the distances between facial features, creating ratios, and then comparing them — was "not reliable," because it depended too much on such variables as a subject's pose and the angle of the camera. A more fundamental problem in Heard's case, he said, was that the image gleaned from the low-resolution surveillance video simply wasn't fit for scientific analysis.
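The measurement scheme Vorder Bruegge criticized — distances between facial landmarks turned into ratios, then compared across images — can be sketched in a few lines. The landmark coordinates and the crude head-turn model below are invented for illustration; they are not drawn from Bavarian's report or from any real case data:

```python
import math

# Hypothetical 2D landmark coordinates (in pixels) for one face image.
FACE = {"l_eye": (100, 100), "r_eye": (160, 100),
        "nose": (130, 140), "mouth": (130, 175)}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def feature_ratios(face):
    """Landmark distances normalized by interocular distance,
    the kind of scale-free ratios a manual comparison might use."""
    eye_span = dist(face["l_eye"], face["r_eye"])
    return {
        "nose_to_eye": dist(face["nose"], face["l_eye"]) / eye_span,
        "mouth_to_nose": dist(face["mouth"], face["nose"]) / eye_span,
    }

def turn_head(face, angle_deg):
    """Crude model of rotating the head about a vertical axis:
    horizontal spans foreshorten by cos(angle); vertical spans do not."""
    c = math.cos(math.radians(angle_deg))
    cx = sum(p[0] for p in face.values()) / len(face)
    return {k: (cx + (x - cx) * c, y) for k, (x, y) in face.items()}

frontal = feature_ratios(FACE)
turned = feature_ratios(turn_head(FACE, 30))
# The SAME face yields different ratios once the pose changes,
# which is the pose-dependence Vorder Bruegge described on the stand.
```

Running the sketch, the nose-to-eye ratio shifts noticeably between the frontal and turned views even though the underlying face is identical — the kind of variability that makes such ratios unreliable as identification evidence.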