Supervisor Aaron Peskin’s office has become well-known for its commitment to preserving privacy, in spite of (or perhaps because of) the prevalence of technology companies a stone’s throw away — several of which have crossed the line from safeguarding people’s information to leaking it. Last November, San Francisco voters passed Proposition B, a measure Peskin put on the ballot to affirm residents’ right to know how their personal information is used, and to opt out of that usage if desired.
Now, his office has dedicated itself to two new privacy-related tasks: shining a light on what types of surveillance the city government employs and why, and placing a ban on the use of face-recognition software. This legislation would, Peskin claims, “pull back the shroud of secrecy that is draped over the use of surveillance industries.”
The legislation, introduced in Tuesday’s Board of Supervisors’ meeting, would require city agencies to submit to the board a Surveillance Technology Policy ordinance and a Surveillance Impact Report before they can purchase or borrow new surveillance technology. Before such technology can be installed, its location and use must be disclosed.
In addition, it would ban the use of controversial facial-recognition technology by city agencies, a tool that Peskin called “a coercive and dangerous weapon.”
“It is essential to have an informed public debate as early as possible about decisions related to surveillance technology,” the legislation reads. “While surveillance technology may threaten the privacy of us all, surveillance efforts have historically been used to intimidate and oppress certain communities and groups more than others, including those that are defined by a common race, ethnicity, religion, national origin, income level, sexual orientation, or political perspective.”
The latter point is particularly important. A number of research studies show that face-recognition software is not yet accurate enough to reliably identify people of color or women, posing a serious risk of misidentification.
Or, as Peskin points out, “it reflects the biases of the people who develop it.”
While the legislation covers all city agencies, the San Francisco Police Department would almost certainly be most affected by such a policy. If given free rein, SFPD could in theory use facial-recognition software to track down missing children, automatically identify suspects, or screen for known offenders. It could even be paired with officers’ body cameras, effectively providing walking face-recognition capabilities.
While sweeping, the Stop Secret Surveillance ordinance and its ban on face recognition were not created in a bubble. Last year, the state Senate passed Senate Bill 1186, which, as of July 1, requires police departments to submit a Surveillance Use Policy at a public meeting each year, disclosing their technology and the types of information collected.
And the issue has been raised by other local agencies. Former BART Board Director Nick Josefowitz proposed installing facial-recognition software in the transit agency’s camera systems before that plan was shot down.
“As goes San Francisco, so goes California, as goes California, so goes the nation, so let’s give it a shot here in S.F.,” Peskin said. “San Francisco is a city of firsts.”
The legislation will land in the Rules Committee for an initial vote in the coming weeks.