Facebook Adds 3,000 Pairs of Eyes

By Stephen Jackson - May 10, 2017 - SF Weekly

Last Wednesday, Facebook announced plans to hire 3,000 employees worldwide to address growing concerns over inappropriate content being shared on the platform. The new hires will join the 4,500 employees already tasked with monitoring harmful and, at times, gruesome content.

April was not a good month for the Facebook Live platform. On April 16, a Cleveland man uploaded a video of himself killing an elderly man, apparently at random. A manhunt ensued, and the killer ultimately took his own life. On April 24, in what may be the most horrific social-media upload to date, a Thai man hanged his 11-month-old daughter in a series of two live-streamed videos and subsequently killed himself.

The child’s mother, powerless to intervene, witnessed the murder in real time. The footage remained on the man’s page for 24 hours after police found the two bodies, and garnered a combined 370,000 views.

“This is an appalling incident, and our hearts go out to the family of the victim. There is absolutely no place for acts of this kind on Facebook,” a company spokesperson said in the wake of the Thailand incident. However, with Facebook announcing last week that it’s reached nearly two billion monthly active users — 26 percent of the world’s population — it is clear the company needed to show a more concerted prevention effort than statements like the one above.

“If we’re going to build a safe community, we need to respond quickly,” wrote CEO Mark Zuckerberg in a Facebook post last week, announcing the staff increase. “We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

Facebook receives “millions of reports” each week of people potentially hurting themselves or others. In addition to expanding the staff tasked with acting on these reports, the company is also building more tools to “keep our community safe.”

“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards, and easier for them to contact law enforcement if someone needs help. As these become available, they should help make our community safer,” Zuckerberg continued.

Monitors review content once it is flagged and determine whether it should be taken down. However, it’s not clear whether the new hires will be Facebook employees or independent contractors around the world.

And the job itself is taxing. Three months ago, two members of Microsoft’s Online Safety Team sued the company, claiming they developed post-traumatic stress disorder from the disturbing content they were required to view.

The announcement caused the Electronic Frontier Foundation (EFF), a digital civil-liberties nonprofit, to cast a wary eye toward Facebook.

“There’s all kinds of content on social media that people can legitimately complain about, and that companies have the legal right to scrub from their platform,” EFF staff attorney Sophia Cope tells SF Weekly. “But if they’re going to do that, then EFF’s position is consistently that they need to do it in a very careful and transparent manner so as not to unduly impact the purpose behind these platforms, which is to enable that average person to access a platform for spreading information far and wide, and receiving information.”

When asked whether it would be reasonable to take down footage of last month’s Cleveland murder or the Thai infanticide, Cope says EFF does not take an official position on any specific video online.

“The fact of the matter is that sometimes there is violent content that is revealing human rights abuse, for example. Or it has journalistic value, academic research value, or scientific value,” Cope says, pointing out that recent online depictions of police brutality have spurred a great deal of public debate over the use of force.

While EFF’s neutral position regarding any specific video is understandable, it’s a non-starter to argue that leaving last month’s videos (or similar footage of suicides, sexual assaults, and revenge porn) online to rack up millions of views is appropriate.

But take a look at the Vietnam War. Horrific images of violence abroad, shared with the public for the first time, helped turn public opinion against the conflict at home. How might the Arab Spring have been different without raw depictions of injustice spread across Facebook? Without footage of abusive and racist cops caught in the act, would we have seen the same nationwide push for officers to wear body cameras?

In fact, Facebook drew fire last year when it censored Nick Ut’s iconic image of Phan Thi Kim Phuc, a 9-year-old girl fleeing a napalm attack in Trang Bang in 1972. The backlash was so severe that the company restored the image only hours after taking it down.

“An image of a naked child would normally be presumed to violate our community standards, and in some countries might even qualify as child pornography,” wrote Facebook in a statement regarding the image. “In this case, we recognize the history and global importance of this image in documenting a particular moment in time.”

As the line between what happens online and “in real life” becomes increasingly blurred, the issue of what to do with harmful — or potentially powerful — content only gets more complicated. At what point does a violent video with the ability to incite social change become a snuff film, and vice-versa? Is an image of a soldier murdering someone more “educational” than live-streamed domestic violence?

Perhaps the problem is that such content feels too personal. When we can place victims in a relatable place and time, it feels more real because we can see ourselves, or someone we love, as the object of harm.

However, that line of reasoning doesn’t hold up when it comes to determining what should and shouldn’t remain online. Everything is personal to somebody. No matter what, a horrific video is hitting close to home somewhere.

After Ut’s Vietnam image was taken down, Espen Egil Hansen, editor-in-chief of Aftenposten, Norway’s largest newspaper, called Zuckerberg the “most powerful editor-in-chief in the world,” and he may be right. By many standards, Facebook is the largest media company on the planet, whether it wants to admit it or not. If that’s the case, at what point do content guidelines like those governing network television, magazines, and newspapers (like this one) apply?

For now, it seems the focus should be on promoting transparency in matters of corporate censorship as well as the rightful scrubbing of content that falls outside a company’s terms of service. Let’s hope to see a Facebook post in the near future outlining the specific criteria by which material online is slated for removal.