As our cultural communications have moved increasingly online, often to spaces controlled by large corporations such as Facebook, Google, and Twitter, free-speech advocates have watched in dismay as those entities have inconsistently enforced their own content policies. Unlike U.S. government entities such as public libraries and schools, private corporations can remove whatever content they wish. Users may cry foul when a takedown seems to contradict a site’s own Terms of Service, but ultimately they have no legal recourse. That’s why an ambitious new project from the Electronic Frontier Foundation aims to at least add some transparency to the process.
At the new site, OnlineCensorship.org, users can report when their content has been censored on any of six major social media or video/image-sharing sites, including Facebook, Twitter, and YouTube. The main goal is not necessarily to get each individual’s verboten postings restored, but to build a database that will allow EFF and its partner Visualizing Impact to draw a clearer picture of what gets censored on each site and how often companies ignore their own rules in their moderation decisions. Facebook, for instance, has a well-known habit of removing images of female nipples, even in works of art that should be allowed under its own Community Standards.
One issue that EFF and Visualizing Impact hope to highlight is the often unfair nature of community moderation. In many cases, groups of users or even government officials have used social media reporting tools to anonymously censor content they simply don’t agree with. In fact, the idea for OnlineCensorship.org was conceived after EFF’s Jillian York and Visualizing Impact CEO Ramzi Jaber noticed that posts advocating for Palestinian statehood often disappear after being reported as “abusive,” particularly on Facebook. The social media giant has also deleted posts containing videos or images of Tibetan monks who self-immolate to protest the Chinese government’s human rights abuses in their homeland. There is no doubt that such videos are disturbing, but that is precisely the point of this ultimate act of protest. The OnlineCensorship.org partners hope to heighten social media users’ awareness of just what corporations are “protecting” them from, and whether they might rather judge that content for themselves.
If you’ve ever had content censored by any of the sites currently covered in the OnlineCensorship.org database, consider submitting a report there. (Note that the project is not collecting information about copyright takedown notices. While those certainly can be used to censor content, they’re already tracked through the EFF-involved project Lumen, formerly known as Chilling Effects.) Even anonymous reports will help to build data visualizations, and may eventually lead to more transparency and a more just model of content policing from social media companies. The site has also compiled information on how to appeal moderation decisions from each service it covers.
Contributing Editor Maren Williams is a reference librarian who enjoys free speech and rescue dogs.