Content moderation is not a panacea: Logan Paul, YouTube, and what we should expect from platforms

What do we expect of content moderation? And what do we expect of platforms?

There is an undeniable need, now more than ever, to reconsider the public responsibilities of social media platforms. For too long, platforms have enjoyed generous legal protections and an equally generous cultural allowance to be “mere conduits,” not liable for what users post to them. In the shadow of this protection, they have constructed baroque moderation mechanisms: flagging, review teams, crowdworkers, automatic detection tools, age barriers, suspensions, verification status, external consultants, blocking tools. They all engage in content moderation, but are not obligated to; they do it largely out of sight of public scrutiny, and are held to no official standards as to how they do so. This needs to change, and it is beginning to.

But in this crucial moment, one that affords such a clear opportunity to fundamentally reimagine how platforms work and what we can expect of them, we might want to get our stories straight about what those expectations should be.

The latest controversy involves Logan Paul, a twenty-two-year-old YouTube star with 15 million-plus subscribers. His videos, a relentless barrage of boasts, pranks, and stunts, have garnered him legions of adoring fans. But he faced public backlash this week after posting a video in which he and his buddies ventured into the Aokigahara forest of Japan, only to find the body of a young man who had recently committed suicide. Rather than turning off the camera, Paul continued his antics, pinballing between awe and irreverence, showing the body up close and then turning the attention back to his own reaction. The video lingers on the body, including close-ups of the man’s swollen hand, and Paul’s reactions are self-centered and cruel. After a blistering wave of criticism in the video comments and on Twitter, Paul removed the video and issued a written apology, which was itself criticized for not striking the right tone. A somewhat more heartfelt video apology followed. He later announced he would be taking a break from YouTube.

There is no question that Paul’s video was profoundly insensitive, an abject lapse in judgment. But amidst the reaction, I am struck by the press coverage of and commentary about the incident: the willingness to lump this controversy in with an array of other concerns about what’s online, as somehow all part of the “content moderation” problem, paired with a persistent and unjustified optimism about what content moderation should be able to handle.

YouTube has weathered a series of controversies over the course of the last year, many of which had to do with children, both their exploitation and their vulnerability as audiences. There was the controversy about popular vlogger PewDiePie, condemned for including anti-Semitic humor and Nazi imagery in his videos. Then there were the videos that slipped past the stricter standards YouTube has for its Kids app: amateur versions of cartoons featuring well-known characters with weirdly upsetting narrative third acts. That was quickly followed by the revelation of entire YouTube channels of videos in which children were being mistreated, frightened, and exploited, videos that seemed designed to skirt YouTube’s rules against violence and child exploitation. And just days later, Buzzfeed also reported that YouTube’s autocomplete displayed results that seemed to point to child sexual exploitation. YouTube representatives have apologized for all of these, and promised to increase the number of moderators reviewing their videos, to aggressively pursue better artificial intelligence solutions, and to remove advertising from some of the questionable channels.

Content moderation, and different kinds of responsibility

But what do these incidents have in common, besides the platform? Journalists and commentators are eager to lump them together: part of a single condemnation of YouTube, its failure to moderate effectively, and its complicity with the profits made by producers of salacious or reprehensible content. But these incidents represent different kinds of problems; they implicate YouTube and content moderation in different ways — and, when lumped together, they suggest a contradictory set of expectations we have for platforms and their public responsibility.

Platforms assert a set of normative standards, guidelines by which users are expected to comport themselves. It is difficult to convince every user to honor these standards, in part because the platforms have spent years promising users an open and unfettered playing field, inviting them to do or say whatever they want. And it is difficult to enforce these standards, in part because the platforms have few of the traditional mechanisms of governance: they can’t fire us; we are not salaried producers. All they have are the terms of service and the right to delete content and suspend users. And there are competing economic incentives for platforms to be more permissive than they claim to be, and to treat high-value producers differently than the rest.

Incidents like the exploitative videos of children, or the misleading amateur cartoons, take advantage of this system. They live amidst this enormous range of videos, some subset of which YouTube must remove. Some come from users who don’t know or care about the rules, or find what they’re making perfectly acceptable. Others are deliberately designed to slip past moderators, either by going unnoticed or by walking right up to but not across the community guidelines. They sometimes require hard decisions about speech, community, norms, and the right to intervene.

Logan Paul’s video and PewDiePie’s racist outbursts are of a different sort. As was clear in the news coverage and the public outrage, critics were troubled by Logan Paul’s failure to consider his responsibility to his audience, to show more dignity as a videomaker, to choose sensitivity over sensationalism. The fact that he has 15 million subscribers, many of them young, was reason for many to claim that he (and, by implication, YouTube) has a greater responsibility. These sound more like traditional media concerns: the effects on audiences, the responsibilities of producers, the liability of providers. This could just as easily be a discussion about Ashton Kutcher and an episode of Punk’d. What would Kutcher’s, his production team’s, and MTV’s responsibility be if he had similarly crossed the line with one of his pranks?

But MTV was in a structurally different position than YouTube. We expect MTV to be accountable for a number of reasons: they had the opportunity to review the episode before broadcasting it; they employed Kutcher and his team, affording them specific power to impose standards; and they chose to hand him the megaphone in the first place. While YouTube also affords Logan Paul a way to reach millions, and he and YouTube share advertising revenue from popular videos, these offers are in principle made to all YouTube users. YouTube is a distribution platform, not a distribution bottleneck — or it is a bottleneck of a very different shape. This does not mean we cannot or should not hold YouTube accountable. We could decide as a society that we want YouTube to meet exactly the same responsibilities as MTV, or more. But we must take into account that these structural differences change not only what YouTube can do, but how and why we can expect it of them.

Moreover, is content moderation the right mechanism to manage this responsibility? Or to put it another way, what would the critics of Paul’s video have wanted YouTube to do? Some argued that YouTube should have removed the video before Paul did. (It seems the video was reviewed and was not removed, but that Paul received a “strike” on his account, a kind of warning — we know this only based on this evidence. If you want to see the true range of disagreement about what YouTube should have done, just read down the lengthy thread of comments that followed this tweet.) In its PR response to the incident, a YouTube representative said it should have taken the video down, for being “shocking, sensational or disrespectful”. But it is not self-evident that Paul’s video violates YouTube’s policies. And judging from the comments of critics, it was Paul’s blithe, self-absorbed commentary, the tenor he took about the suicide victim he found, as much as the showing of the body itself, that was so troubling. Showing the body, lingering on its details, was part of Paul’s casual indifference, but so were his thoughtless jokes and exaggerated reactions. Is it so certain that YouTube should have removed this video on our behalf? I do not mean to imply that the answer is no, or that it is yes. I’m only noting that this is not an easy case to adjudicate — which is precisely why we shouldn’t expect YouTube to already have a clean and settled policy towards it.

There’s no simple answer as to where such lines should be drawn. Every bright-line rule YouTube might draw will be plagued with “what abouts”. Is it that corpses should not be shown in a video? What about news footage from a battlefield? What about public funerals? Should the prohibition be specific to suicide victims, out of respect? It would be reasonable to argue that YouTube should allow a tasteful documentary about the Aokigahara forest, concerned about the high rates of suicide among Japanese men. Such a video might even, for educational or provocative reasons, include images of the body of a suicide victim, or evidence of their death. In fact, YouTube already has some, of a variety of qualities (see 1, 2, 3, 4).

So what we critics may be implying is that YouTube should be responsible for distinguishing the insensitive versions from the sensitive ones. Again, this sounds more like the kinds of expectations we had for television networks — which is fine if that’s what we want, but we should admit that this would be asking much more from YouTube than we might think.

As a society, we’ve already struggled with this very question in traditional media: Should the news show the coffins of U.S. soldiers as they are returned from war? Should the news show the grisly details of crime scenes? When is a video that would typically be too graphic acceptable because it is newsworthy, educational, or historically relevant? Not only is the answer far from clear, it differs across cultures and periods. As a society, we need to engage in the debate; it cannot be answered for us by YouTube alone.

These moments of violation serve as the spark for that debate. It may be that all this condemnation of Logan Paul, in the comment threads on YouTube, on Twitter, and in the press coverage, is the closest we get to a real, public consideration of what’s appropriate for public consumption. And maybe the focus among critics on Paul’s irresponsibility, as opposed to YouTube’s, is an indication that this is not a moderation question, or of a growing public sense that we cannot rely on YouTube’s moderation, that we need to cultivate a clearer sensibility of what public culture should look like and teach creators to take their public responsibility more seriously. (Though even if it is, there will always be a new wave of twenty-year-olds waiting in the wings, who will jump at the chance social media offers to show off for a crowd, way before they ever grapple with whatever social norms we may have worked out. This is why we need to keep having this debate.)

How exactly YouTube is complicit in the choices of its stars

This is not to suggest that platforms bear no responsibility for the content they help circulate. Far from it. YouTube is implicated, in that it affords Paul the opportunity to broadcast his tasteless video, helps him gather millions of viewers who will have it instantly delivered to their feeds, designs and tunes the recommendation algorithms that amplify its circulation, and profits enormously from the advertising revenue it accrues.

Some critics are doing the important work of putting platforms under scrutiny, to better understand the way producers and platforms are intertwined. But it is awfully tempting to draw too simple a line between the phenomenon and the provider, to paint platforms with too broad a brush. The press loves villains, and YouTube is one right now. But we err when we draw these lines of complicity too cleanly. Yes, YouTube benefits financially from Logan Paul’s success. That by itself does not prove complicity; it needs to be a feature of our discussion about complicity. We might want revenue sharing to come with greater obligations on the part of the platform; or, we might want platforms to be shielded from liability or obligation no matter what the financial arrangement; or, we might want equal obligations whether there is revenue shared or not; or we might want obligations to attend to popularity rather than revenue. These are all possible structures of accountability.

It is also easy to say that YouTube drives vloggers like Logan Paul to be more and more outrageous. If video makers are rewarded based on the number of views, whether that reward is financial or just reputational, it stands to reason that some videomakers will look for ways to increase those numbers, including going bigger. But it is not clear that metrics of popularity necessarily or only lead to being ever more outrageous, and there’s nothing about this tactic that is unique to social media. Media scholars have long noted that being outrageous is one tactic producers use to cut through the clutter and grab viewers, whether it’s blaring newspaper headlines, trashy daytime talk shows, or sexualized pop star performances. That is hardly unique to YouTube. And YouTube videomakers are pursuing a number of strategies to seek popularity and the rewards therein, outrageousness being just one. Many more seem to depend on repetition, building a sense of community or following, interacting with individual subscribers, and the attempt to be first. While over-caffeinated pranksters like Logan Paul might try to one-up themselves and their fellow vloggers, that is not the primary tactic for unboxing vidders or Minecraft world builders or fashion advisers or lip syncers or television recappers or music remixers. Others see Paul as part of a “toxic YouTube prank culture” that migrated from Vine, which is another way to frame YouTube’s responsibility. But a genre may develop, and a provider profiting from it may look the other way or even encourage it; that does not answer the question of what responsibility they have for it, it only opens it.

To draw too straight a line between YouTube’s financial arrangements and Logan Paul’s increasingly outrageous shenanigans misunderstands both the economic pressures of media and the complexity of popular culture. It ignores the lessons of media sociology, which make clear that the relationship between the pressures imposed by industry and the creative choices of producers is much more complex and dynamic. And it does not prove that content moderation is the right way to address this complicity.

*   *   *

Let me say again: Paul’s video was in poor, poor taste, and he deserves all of the criticism he received. And I find this genre of boffo, entitled, show-off masculinity morally problematic and just plain tiresome. And while it may sound like I am defending YouTube, I am definitely not. Along with the other major social media platforms, YouTube has a greater responsibility for the content they circulate than they have thus far acknowledged; they have built a content moderation mechanism that is too reactive, too dismissive, and too opaque, and they are due for a public reckoning. In the last few years, the workings of content moderation and its fundamental limitations have come to light, and this is good news. Content moderation should be more transparent, and platforms should be more accountable, not only for what traverses their systems, but also for the ways in which they are complicit in its production, circulation, and impact. But it also seems we are too eager to blame all things on content moderation, and to expect platforms to maintain a perfectly honed moral outlook every time we are troubled by something we find there. Acknowledging that YouTube is not a mere conduit does not imply that it is exclusively responsible for everything available there.

As Davey Alba at Buzzfeed argued, “YouTube, after a decade of being the pioneer of internet video, is at an inflection point as it struggles to control the vast stream of content flowing across its platform, balancing the need for moderation with an aversion toward censorship.” This is true. But we are also at an inflection point of our own. After a decade of embracing social media platforms as key venues for entertainment, news, and public exchange, and in light of our growing disappointment at the preponderance of harassment, hate, and obscenity on them, we too are struggling: to modulate exactly what we expect of them and why, to balance how to improve the public sphere with what role intermediaries can reasonably be asked to take.

This essay is cross-posted at Social Media Collective. Many thanks to Dylan Mulvin for helping me think this through.