I wanted to circulate a new essay, “Governance of and by platforms,” set to appear in the SAGE Handbook of Social Media, edited by Jean Burgess, Thomas Poell, and Alice Marwick, being readied now for a 2017 publication. It also sets up the agenda of my forthcoming book, from Yale University Press, also set for next year. So, I very much welcome comments and suggestions.
ABSTRACT: Platforms rose up out of the exquisite chaos of the web. They were inspired by the freedom it promised, but they also had distinct and more guarded uses in mind. Yet as these platforms grew, the chaos they intended to escape found its way onto them, driven by the same motivation: if I want to say something, be it noble or reprehensible, I want to say it where everyone will hear me. These platforms put people in close proximity – and though the benefits of this may be obvious and sometimes utopian, the negatives are becoming painfully apparent as well.
Today we are by and large speaking from platforms when we’re online. By platforms, I mean sites and services that host public expression, store it on and serve it up from the cloud, organize access to it through search and recommendation, or install it onto mobile devices. This includes Facebook, YouTube, Twitter, Tumblr, Pinterest, Google+, Instagram, and Snapchat… but also Google Search and Bing, Apple App Store and Google Play, Medium and Blogger, Foursquare and Nextdoor, Tinder and Grindr, Etsy and Kickstarter, Whisper and Yik Yak. What unites them is their central offer: to host and organize user content for public circulation, without having produced or commissioned it. They don’t make the content, but they make important choices about that content: what they will distribute and to whom, how they will connect users and broker their interactions, and what they will refuse. With this growing and increasingly powerful set of digital intermediaries, we have to revisit difficult questions about how they structure the speech and social activity they host, and what rights and responsibilities should accompany that powerful position.
This essay begins by discussing the governance of platforms: the policies that have emerged in the past decade specifying platform liabilities (or lack thereof) for the user content and activity they host. In the U.S., these regulations are limited by a fundamental reluctance to constrain speech, whereas internationally, these same platforms face a wider array of restrictions. It will then consider governance by platforms — related to the first, but not the same. Social media platforms have taken on the responsibility of curating the content and policing the activity of their users: not simply to meet legal requirements, or to avoid having additional policies imposed, but also to avoid losing offended or harassed users, to placate advertisers eager to associate their brands with a healthy online community, to protect their corporate image, and to honor their own personal and institutional ethics. Some of these interventions are welcomed by users, while others have been more contentious.
The regulatory framework we impose on platforms, and the ways in which the major platforms enact those obligations and impose their own on their users, are settling in as the parameters for how public speech online is and will be privately governed.