SOPA and the strategy of forced invisibility

Since I supported the blacking out of the MSR Social Media Collective blog, to which I sometimes contribute, and of Culture Digitally, which I co-organize, in order to join the SOPA protest led by the “Stop American Censorship” effort, the Electronic Frontier Foundation, Reddit, and Wikipedia, I thought I should weigh in with my own concerns about the proposed legislation.

While it’s reasonable for Congress to look for progressive, legislative ways to enforce copyrights and discourage flagrant piracy, SOPA (the Stop Online Piracy Act) and PIPA (the Protect IP Act) now under consideration are a fundamentally dangerous way to go about it. Their many critics have raised compelling objections [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16]. But in my eyes, the bills are most dangerous because of their underlying logic: policing infringement by rendering sites invisible.

Under SOPA and PIPA, if a website is even accused of hosting or enabling infringing materials, the Attorney General can order search engines to delete that site from their listings, require ISPs to block users’ access to it, and demand that payment services (like PayPal) and advertising networks cancel their accounts with it. (This last step can even be taken by copyright holders themselves, with only a good-faith assertion that the site in question is infringing.) What a tempting approach to policing the Internet: rather than pursuing and prosecuting this site and that site, in an endless game of whack-a-mole, just turn to the large-scale intermediaries and use their power to make websites available in order to make them unavailable. It shows all too plainly that the Internet is not some wide-open, decentralized, unregulatable space, as some have believed. But it undercuts the longstanding American tradition of governing information, which has always erred on the side of letting information, even abhorrent or criminal information, be accessible to citizens, so we can judge for ourselves. Making it illegal to post something is one thing; wiping the entire site clean off the board as if it never existed is another.
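To make the “forced invisibility” concrete, consider the DNS-blocking lever in particular: the accused site’s servers keep running, but its name simply stops resolving for ordinary users. The following is a minimal, illustrative sketch in Python, not anything drawn from the bills themselves; the blocklist and the domain names are hypothetical.

```python
# A minimal, hypothetical sketch of DNS-level blocking: the resolver consults
# a blocklist before answering, so a blocked site's server keeps running but
# its name no longer resolves for users behind this resolver.
import socket

BLOCKED_DOMAINS = {"accused-site.example"}  # hypothetical, court-ordered list


def resolve(hostname: str) -> str:
    """Return an IP address for hostname, unless the name is blocklisted."""
    if hostname in BLOCKED_DOMAINS:
        # The content is not removed and no court has ruled on the merits;
        # the name is simply made to disappear from this resolver's answers.
        raise LookupError(f"{hostname}: blocked (treated as nonexistent)")
    return socket.gethostbyname(hostname)


if __name__ == "__main__":
    print(resolve("example.com"))   # resolves normally
    try:
        resolve("accused-site.example")
    except LookupError as err:
        print(err)                  # rendered invisible, not adjudicated
```

Search delisting and payment cut-offs work the same way in spirit: the site still exists, but the infrastructure that makes it findable and sustainable refuses to acknowledge it.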

Expunging an allegedly infringing site so that it cannot be found is problematic in itself, a clear form of “prior restraint.” But it is exacerbated by the fact that whole sites might be rendered invisible on the basis of just a few bits of infringing content they may host. This is particularly troubling for sites that host user-generated content, where one infringing thread, post, or community might co-exist amidst a trove of other legitimate content. Under SOPA and PIPA, a court order could remove not just the offending thread but the entire site from Google’s search listings, from ISPs, and from ad networks, all in a blink.

These are the same strategies that China, Iran, and Vietnam currently use to restrict political speech (as prominent critics have charged), and that were recently used against Wikileaks right here at home. When Amazon kicked Wikileaks off its cloud computing servers, when Wikileaks was de-listed by one DNS operator, and when Mastercard and Paypal refused to take donations for the organization, they were attempting to render Wikileaks invisible before a court ever determined, or even alleged, that Wikileaks had broken any laws. So it is not hypothetical that this tactic of rendering sites invisible endangers not only commercial speech and the expressive rights of individual users, but also vital, contested political speech. SOPA and PIPA would simply organize these tactics into a concerted, legally enforced effort to erase, one that all search engines and ISPs would be obligated to carry out.

A lighthearted aside: In the film Office Space, the soulless software company chose not to fire the hapless Milton. Instead, they took away his precious stapler, moved him to the basement, and simply stopped sending him paychecks. We laughed at the blank-faced cruelty, because we recognized how tempting this solution would be, a deft way to avoid having to fire someone to their face. Congress is considering the same “Bobs” strategy here. But while it may be fine for comedy, this is hardly the way to address complex legal challenges around the distribution of information, challenges that should be dealt with in the clear light of a courtroom. And it risks rendering invisible elements of the web that might deserve to remain.

We are at a point of temptation. The Internet is both so powerful and so unruly because anyone can add their site to it (be it noble or criminal, informative or infringing) and it will be found. It depends on, and presumes, a principle of visibility. Post the content, and it is available. Request it, from anywhere in the world, and the DNS servers will find it. Search for it in Google, and it will appear. But, as those who find this network most threatening come calling, with legitimate (at least in the abstract) demands to protect children / revenue / secrets / civility, we will be sorely tempted to address these challenges simply by wiping them clean off the network.

This is why the responses to SOPA and PIPA, most prominently the January 18 blackouts by Reddit, Wikipedia, and countless blogs, are so important. Removing their content, even for a day, is meant to show how dangerous this forced invisibility could be. It should come as no surprise that, while many other Internet companies have voiced their concerns about SOPA, it is Wikipedia and Reddit that have gone the farthest in challenging the law. Not only do they host, i.e. make visible, an enormous amount of user-generated content; they are also themselves governed in important ways by their users. Their decisions to support a blackout were themselves networked affairs that benefited from all of their users having the ability to participate, and that recognized this commitment to openness as part of their fundamental mission.

Whether you care about the longstanding U.S. legal tradition of information freedoms, or the newly emergent structural logic of the Internet as a robust space of public expression, both require a new and firm commitment in our laws: to ensure that the Internet remains navigable, that sites remain visible, that pointers point and search engines list, regardless of the content. Sites hosting or benefitting from illegal or infringing content should be addressed directly by courts and law enforcement, armed with a legal scalpel that’s delicate enough to avoid carving off huge swaths of legitimate expression. We might be able to build a coalition of content providers and technology companies willing to partner on anti-piracy legislation, if copyright holders could admit that they need to go after the determined, underground piracy networks bent on evading regulation, and not in the same gesture put YouTube at risk for a video of a kid dancing to a Prince tune — there is a whole lot of middle ground there. But a policy premised on rendering parts of the web invisible is not going to accomplish that. And embracing this strategy of forced invisibility is too damaging to what the Internet is and could be as a public resource.

(Cross-posted at MSR’s Social Media Collective.)
