Charlotte Silver 16 November 2016
Korryn Gaines was shot to death by Baltimore police in August, after Facebook complied with their request to cut off a live broadcast she was making during a traffic stop. (via Facebook)
Pressure is growing on Facebook to shed light on its decisions to censor content and user accounts.
“It’s not just a platform where people are getting news, but it’s increasingly a platform where people are documenting human rights injustices and breaking news,” Chinyere Tutashinda of the Center for Media Justice told The Guardian last month.
The Center for Media Justice is one of more than 70 organizations that wrote to Facebook CEO Mark Zuckerberg demanding that his company adopt a more transparent policy on removing content.
Palestine Legal, the American Civil Liberties Union, Color of Change, 18MillionRising and Dream Defenders also signed the letter, which specifically points to the disabling of Palestinian journalists’ accounts and the removal of Black activists’ content as examples of Facebook’s censorship.
It also refers to the deactivation of the account of Korryn Gaines.
“When the most vulnerable members of society turn to your platform to document and share experiences of injustice, Facebook is morally obligated to protect that speech,” the groups write.
“We welcome feedback from our community as we begin allowing more items that people find newsworthy, significant or important to the public interest,” a spokesperson for Facebook told The Guardian.
Facebook is meeting with organizers of the letter this week.
In August, Facebook granted an emergency request from Baltimore police to deactivate the account of 23-year-old Korryn Gaines, who was broadcasting her standoff with police on Facebook until her account was shut down.
After Gaines’ live broadcast was cut off, police shot her to death.
A Facebook spokesperson told The Intercept that the company complied with police requests to deactivate Gaines’ account because her followers were “encouraging violence,” and that cutting off the broadcast was intended to prevent “physical harm or death.”
In late September, Facebook temporarily disabled the accounts of journalists who administer pages of two of the most widely read Palestinian publications on the Internet.
Facebook quickly claimed the accounts had been disabled by mistake, but the publications believed the incident was related to the recent agreement between Facebook and the Israeli government to closely monitor Palestinian accounts for what Israel claims is “incitement.”
Even before the agreement, Facebook regularly cooperated with the Israeli government, removing content at its request or providing it with data on users.
According to the company’s own records, it complied with 60 percent of government requests for user data and censored 236 pieces of content in the second half of 2015.
Israel defines incitement expansively. Expressing any kind of sympathy on social media with Palestinians killed by Israeli forces can lead to charges of incitement.
This week, Israeli authorities acquitted journalist Khaled Maali on the condition that he deactivate his Facebook account and pay a fine of $1,700. Maali, 48, is from Salfit in the occupied West Bank.
Facebook has also been accused of widely censoring posts and disabling accounts after users posted information related to Kashmir, where the Indian government is waging a brutal crackdown on protesters.
Records show Facebook complied with 50 percent of Indian government requests for user data and restricted nearly 15,000 posts in the latter half of 2015.
Facebook has defended its censorship of posts related to Kashmir and its cooperation with Israel by stating that there is “no place for terrorists or content that promotes terrorism on Facebook.”
But how Facebook determines whether a post crosses the line is unclear.
In the letter to Zuckerberg, the civil and human rights organizations list four recommendations to make Facebook a more equitable and transparent platform.
They ask Facebook to clarify how it decides to censor content and to make those policies clear and accessible to the public. They also recommend that Facebook implement an appeals process for censored content and refuse to disclose customer content and data unless required by law.
The letter was sent to Zuckerberg about a week after Facebook announced that it was reconsidering how to make decisions about removing content.
But it was not until Facebook took down the famous Pulitzer Prize-winning photograph of children fleeing a napalm attack during the Vietnam War that the company questioned its policy.
In the photo, one of the children, 9-year-old Kim Phúc, is naked, her clothes burned off by the napalm. Facebook explained it had removed the photograph because it violated the company’s prohibition of images displaying “fully nude genitalia or buttocks, or fully nude female breast.”
Following an international backlash, Facebook eventually conceded its misstep and restored the image.
“We recognize the history and global importance of this image in documenting a particular moment in time,” Facebook stated. “Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal.”
The following month, Facebook announced that it would begin to allow “more items that people find newsworthy, significant, or important to the public interest — even if they might otherwise violate our standards.”
Reem Suleiman from SumOfUs, one of the groups that initiated the letter, told The Electronic Intifada that Facebook’s decision to allow more content is a good step, but she emphasized that it still needs to be more transparent about its relationship with law enforcement and other government agencies.
It remains unclear how Facebook will determine whether an item meets this new standard of “newsworthy and significant.” While a prize-winning photo from 1972 may qualify, material important to groups facing systematic state violence today – including Kashmiris, Palestinians and African American activists – may not.