Limits of Freedom of Speech: Reddit’s Child Pornography Problem

Several weeks ago, the popular message board Reddit announced a policy change banning all “suggestive or sexual content featuring minors.” Owned by Advance Publications, Reddit has made a name for itself in part through its hands-off, pro-free-speech, let-the-users-decide, self-policing approach. In fact, before the policy change, the only rules of the site were no spamming, no cheating, no personal info, nothing illegal, and no interfering with the site’s functionality. Small wonder, then, that the decision provoked a dramatic reaction from the Reddit community, although one could argue that since child porn is illegal, technically there was no rule change at all. Be that as it may, some saw the decision as going against the very nature of Reddit, while others were completely on board with it. One user claimed passionately: “For better or for worse, Reddit has moved from a non-interventionist to a policing organization.”

No doubt, Reddit’s hands-off policy towards the content posted on its site is markedly different from that of Facebook, which promotes a sanitized alternative to the darker corners of the Internet within its gated community. As exposed ad nauseam in a recently leaked document called the Abuse Standards Violations, Facebook has banned a wide range of content, from camel toes to women breastfeeding. Tarleton Gillespie rightfully contends that, through the arbitrary rules delineated in this document, Facebook is able to play the custodian and is ultimately the arbiter of public discourse. The fact that private corporations are able to assume the important role of determining, or helping to determine, what is acceptable as public speech is frightening, for sure. Mostly because, as Gillespie notes, sites like Facebook are relatively opaque about how they manage their custodial duties; they would rather not draw attention to the presence of so much obscene content on their sites, so they regularly engage in censorship to expunge it.

This post, through the Reddit case, sets out to explore the other end of the spectrum.

To be sure, part of this difference lies in the fact that the two sites provide very different types of services and thus have very different user bases, but what brings them together is our concern for freedom of speech on the Internet. This is a valid concern, but unfortunately, the concept itself has degenerated into a gimmick, a tagline of sorts, used by the sincere and the criminal alike, for different purposes, to be sure, but used nonetheless.

In a manner of speaking, Reddit presents a test case for what freedom of speech could bring about. Users can open subreddits on the topics of their choice and vote a particular post up or down, which ultimately earns the post’s owner “karma” points. It is organized chaos, if you will, a veritable democracy, not unlike the boards of 4chan or Something Awful, and, it turns out, it harbors a very dark side of the participatory web.

A cursory glance at the darker subreddits, however, clarifies what appeared to be a dramatic change in policy several weeks ago. Over the last couple of years, several scandals have drawn attention to some of the questionable content housed by the site. As diligently documented by various sites, Gawker in particular, Reddit users kept creating subsections promoting pedophilia and other content such as the raping and battering of women, pictures of dead kids, killing black people (replace this euphemism with the “n” word) and/or women, “choking a bitch,” and other equally jaw-dropping topics that make you feel like you are staring point-blank at the heart of darkness as depicted by Joseph Conrad. The only thing that keeps the general public from accessing this juicy content is a cute little Reddit mascot that asks whether you are 18 and willing to see adult content.

The pedophilia sections of Reddit were first brought to the attention of the mainstream media back in October 2011, following Anderson Cooper’s detailed coverage of the darker side of these message boards. The Jailbait subreddit was home to more than 20,000 users who posted pictures of scantily clad (but clothed nonetheless) teens, many of which were stolen from people’s Facebook profiles.

In response to Cooper’s prime-time coverage, Reddit co-founder Alexis Ohanian defended the site, claiming that Reddit doesn’t host the material; rather, the website is merely a repository of links to other sites on the Internet, and as such it functions like Twitter, which also links to such content. Ohanian argued that, instead of accusing the site of peddling pedophilia, Cooper could have served the public better by encouraging parents to explain to their kids that every time they post a picture somewhere, it is public by default and thus runs the risk of being misused by ill-intentioned folk out there. There is some validity in this argument. We must teach our kids the media literacy required to navigate cyberspace, with all its glory and pitfalls.

What is striking about the Reddit case, however, is that it demonstrates that public discourse is rigorously negotiated both within the Reddit platform and beyond it. In this sense, treating these sites as isolated pockets of community residing in different locations on the Internet would be a reductionist approach to the problem. Unlike Facebook’s invisible hand sanitizing its corporately owned public space, Reddit resembles an early Greek democracy in which the gods are mostly indifferent or, worse yet, abusive, as has allegedly been the case with the r/lgbt subreddit. Elaborating a bit further on the incidents that led to the policy change will clarify this point.

Cooper’s coverage of the Jailbait subreddit was undoubtedly shocking, but what led to r/Jailbait’s closure was a different incident. A redditor who went by the moniker TheContortionist posted an image of his then-underage ex-girlfriend, technically in the nude. Unsurprisingly, the image was voted up amid clamors of “request for more” until the user gave in and posted another one in which the teen was clearly engaged in oral sex. Shocked Reddit users exposed TheContortionist’s post by voting it up to the front page of the site until finally, after a good six hours, the admins were forced to take it down. These images weren’t just posted on the forums but were allegedly distributed through private messages as well. Faced with public outrage, Reddit reluctantly closed down the entire Jailbait section, claiming that it was “threatening the structural integrity of the greater Reddit community.”

Although Jailbait was banned, alternatives quickly sprang up under various other names. It was only a matter of time before another incident, this time in the “preteen_girls” subreddit, caused yet another public outcry. r/preteen_girls mostly featured images of 11-year-old girls in bikinis with sexually explicit captions. It was here that one of the users posted a screenshot of a naked underage girl from a banned film, which quickly provoked the outrage of the residents of yet another message board: Something Awful (SA). The SA Goons (members of the SA forum) launched a campaign to label Reddit a vibrant pedophile scene, urging users to contact churches, schools, local news, and law enforcement to put an end to it. And they won the battle. Reddit responded with an explicit ban on “suggestive or sexual content featuring minors.”

Does this decision mean that Reddit has transformed from a non-interventionist organization into a policing one, as claimed by one of the disappointed redditors? This is an important point to consider.

According to Reddit, the content it houses is self-policed, and as with similar sites, it really can’t regulate the quality of the content, nor should it need to under most circumstances. In essence, this is not very different from how Wikipedia and many other sites that rely on user-generated content operate. Except that the “self-police” part seems to be markedly dysfunctional on Reddit, partly because users have little power over the content of the site beyond notifying a moderator, and, predictably, moderators can be capricious, arbitrary, and inconsistent. Thus far, the site has been evaluating child-porn content on a case-by-case basis, but the word on the street is that the admins don’t take much action when users report these images, or at least are not swift enough when they do. The jury is still out on that…

Democratic, for sure… But a laissez-faire approach to public discourse could have frightening outcomes for freedom of speech if the “self-police” part does not work efficiently or if the policing faction abuses its powers. An abuse report, after all, is a click away, and most sites give the right of way to the person reporting the abuse rather than examining the content in question. To be clear, by frightening outcomes I do not mean the tasteless, offensive content being generated by our fellow kindred all over the planet. I mean the possibility that allowing illegal content to be posted, or at least not taking swift action against it, could invite more government intervention into a space that we, the Internet denizens, hold sacred. The Reddit case, in this respect, presents a case study through which we can examine some of these issues.

When making this statement, I have the following in mind.

A month has passed since the Internet won its battle against the highly controversial bills SOPA and PIPA, which were supported primarily by the media industry in its pursuit to crush the illegal transmission of copyrighted content. Advocating an Internet blacklist and eliminating the safe-harbor clause of the DMCA, these bills threatened the very integrity of the Internet. The protest day was glorious and made unlikely bedfellows of various groups on the Internet. Reddit was one of them; so was Facebook. The day after this momentous victory, the FBI raided the offices of the cyberlocker Megaupload and incarcerated its founder, Kim Dotcom, on racketeering, copyright-infringement, and money-laundering charges. Two weeks later, the Reddit child-porn scandal erupted. Unrelated? Perhaps superficially, but the outcomes of such cases may lead to the same door. They bear the potential to invite excessive government regulation into a space we netizens hold so dear.

Similar discussions have taken place around virtual worlds, whose destiny is closely tied to that of the Internet at large. The well-known legal scholar Jack Balkin, for example, argues that design and play in virtual worlds should themselves be considered exercises of the right to speak and thus have constitutional significance. Accordingly, he posits that much of what goes on in virtual worlds should be protected against state regulation by the First Amendment rights of freedom of expression and association. But the increasing amount of criminal activity and the various communication torts that take place in these spaces, specifically copyright infringement, theft, and fraud, make the First Amendment doctrine less likely to be sufficient to fully protect freedom in virtual worlds. Injured parties end up resorting to real-world courts to resolve their differences, which ultimately invites government regulation into these spaces. In a similar fashion, Greg Lastowka and Dan Hunter state that virtual crimes will be of increasing concern for the communities engaged in the design and experience of virtual worlds as they resist external attempts at legal regulation (p. 124).

To be clear, my goal is not to equate camel toes, breastfeeding, nudity, or offensive content with virtual crime. While the former are a valuable part of public discourse and should therefore be considered protected speech, whether on Facebook or elsewhere, the latter, which in the aforementioned cases amount to copyright infringement, child pornography, and money laundering, are indeed crimes. Make no mistake, governments will be swift to take action against them, and the lobbyists will be there to coax them in the right direction. In fact, these considerations were (and still are) the driving forces behind SOPA, PIPA, and ACTA: bills that ostensibly aim to quash copyright infringement but threaten our freedom of speech in the process.

It is concerns such as these that dictate some of the hard lines that social media sites draw when regulating users’ freedom of speech. That hard line is being negotiated among netizens, activist groups, scholars, companies, lobbyists, politicians, and what have you. It is also being negotiated on a national and international scale. Reddit’s approach could be just as detrimental to the future of freedom of speech as Facebook’s. After all, how many times can you push against a door before it finally bursts open and leads you down a path from which there is no return?


Lastowka, G., & Hunter, D. (2006). Virtual crime. In J. Balkin & B. S. Noveck (Eds.), The State of Play: Law, Games, and Virtual Worlds (pp. 121–136). New York: New York University Press.

Balkin, J. M. (2004). Virtual liberty: Freedom to design and freedom to play in virtual worlds. Virginia Law Review, 90, 2043–2098.