Stronger guidelines and more transparency are needed to combat censorship on social media
This article originally appeared on OpenCanada.org on September 29, 2017, as part of a series with 6 Degrees Citizen Space 2017 speakers entitled "Walls that need to go: Ideas for a more inclusive world."
In the wake of the Occupy movement and the Arab uprisings, an abundance of tech-positive narratives emerged. The internet, it was said, could break down barriers between the strong and the weak, serving as a great equalizer. Pointing to the use of social media to coordinate demonstrations in Egypt, Tunisia, and elsewhere, optimists saw an opportunity for a new digital public square that would facilitate communication among progressive actors.
Yet cyberspace can also replicate many of the barriers and hierarchies that exist offline. In contrast to the belief that the internet is an open market of ideas, every social media platform has a set of community guidelines that it moderates at its own discretion. Since platforms such as Facebook, Twitter and YouTube are corporate-owned and have little accountability to their users, these policies can form a major barrier to free expression. The platforms often lack transparency toward users and have at times shaped their policies in response to government demands. Moreover, vulnerable groups tend to be targeted on the internet, while those with political and economic power have the opportunity to dictate what content gets censored.
The case of Palestine is a prime example. Twenty-six Palestinian journalists are currently under administrative detention by the Israeli government. Meanwhile, political dissent or news about the Israeli occupation shared online has repeatedly been flagged as “incitement” and taken down. The notion of incitement is sufficiently vague that legitimate political criticism is being removed from the internet at the Israeli government’s request. In 2016, after meeting with Israeli government officials, Facebook disabled the accounts of three editors from the Al Quds newspaper and five editors from the Shehab News Agency. Due to public pressure, Facebook eventually apologized for this “mistake,” but offered no explanation for why it occurred.
Palestine is merely one instance of a global phenomenon in which dissenting voices are being stifled by social media platform policies. In Australia, Indigenous feminist activist Celeste Liddle’s Facebook account was banned four separate times for sharing a trailer of an Indigenous comedy show that featured images of topless women. In 2017, a coalition of 77 social and racial justice organizations wrote to Facebook about consistent and disproportionate censorship of Facebook users of colour, including takedowns of images discussing racism. Last year, Facebook censored a video of a mass arrest of 22 activists at a Dakota Access Pipeline protest.
How can we break down barriers to free expression online? It is clear that the community guidelines and content moderation practices of major social media platforms need a rethink. Users must demand greater transparency and communication. A clear set of policies must be made accessible to all and applied evenly and fairly. Moreover, if social media platforms are to be accountable to their users, they need a public appeals process for content takedowns that provides due process. When content is censored, platforms must explain why, and users must have the opportunity to challenge those decisions. An engaged citizenry that values the positive power of social media as a platform for public expression can help to realize this vision.
Ramzi Jaber is co-director of Visualizing Impact (VI), a non-profit that specializes in data visualization on social issues. In partnership with the Electronic Frontier Foundation, VI runs Onlinecensorship.org, which encourages social media platforms to operate with greater transparency in their approach toward content moderation.
Robin Jones is a research and project coordinator at Visualizing Impact.