Here’s a quick thought that comes from my dissertation. It deals with Newsvine, one of the sites on which I did fieldwork, but it involves no inside knowledge of the site. It was originally posted on my personal site, but Hector’s recent comment interrogating the nature of algorithms inspired me to cross-post it. I’d also mention C.W. Anderson and Stuart Geiger as a few of the other scholars engaged in building what Anderson calls a “sociology of algorithms.”
On a site like Newsvine, when enough users report a story or a comment as abuse, it is automatically “collapsed” by the system, meaning that the text of the comment, while not removed altogether, is hidden by default (users who wish to read a collapsed comment can click its heading to reveal the text).
It’s worth noting that any reasonably effective comment-collapsing algorithm is a heterogeneous product. When debates get heated, some comments may simultaneously be voted up by some users and reported as abusive by others. To work effectively, a collapsing script has to operate on a sliding scale: as an article draws more attention and registers more ratings in aggregate, it should also take more negative votes to bring down a comment or the story itself. Common sense suggests that a comment with five abuse reports out of only eight total ratings should probably be collapsed. But what about a comment flagged five times out of a thousand total ratings?
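To make the sliding-scale idea concrete, here is a minimal sketch of such a rule in Python. This is purely hypothetical, not Newsvine’s actual code; the function name, the minimum-report floor, and the fraction threshold are all illustrative assumptions.

```python
def should_collapse(abuse_reports: int, total_ratings: int,
                    min_reports: int = 3, threshold: float = 0.25) -> bool:
    """Hypothetical sliding-scale collapse rule (not Newsvine's algorithm).

    A comment is collapsed only when abuse reports make up a large enough
    *share* of all ratings, so the same five flags that sink a lightly
    rated comment are ignored on a heavily rated one.
    """
    # Too few reports in absolute terms: never collapse.
    if abuse_reports < min_reports:
        return False
    # Otherwise, compare the report fraction to the threshold.
    return abuse_reports / max(total_ratings, 1) >= threshold


# The two cases from the text:
print(should_collapse(5, 8))     # five reports out of eight ratings -> True
print(should_collapse(5, 1000))  # five reports out of a thousand -> False
```

Even in this toy version, the “social” choices are visible as parameters: `min_reports` and `threshold` encode judgments about how much controversy a community should tolerate before a comment disappears from default view.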
Creating a collapsing algorithm that intelligently weighs the balance of votes for and against a comment is a tricky challenge, simultaneously technical and social. John Law, in examining the engineering equations behind contemporary aircraft design, underscored the way in which equations that on their face purported simply to describe the physics of lift, wing contouring, and airspeed were ultimately as much or more about the comfort and safety of crews and pilots. In his account, a neat, concise equation for acceptable values of “gust response” appears, at the end of the day, deceptively simple and technically oriented, hiding the range of concerns that went into its production:
Removed from the flat space occupied by the formalism, we find ourselves in the sweating world of the aircrew. We discover pilots who flew their creaking aircraft too low, pilots who worried about whether the wings would break off, pilots who were thrown about their cockpits, pilots who climbed shaking from their aircraft at the end of these flights. If we are imaginative, then perhaps we can smell the fear, feel the sweat on the bodies, the taste of vomit. (p. 123)
Bruno Latour refers to this process, through which the messy and diverse world of experience is reduced to an equation, an engineering diagram, or an algorithm, as deflation. While Newsvine is far removed from the life-and-death world of experimental aviation, we can similarly understand moderation algorithms as concealing a great deal of hard-won experience. They contain within them a set of working assumptions about what we tend to think of as fundamentally social problems: the extent to which controversy should be tolerated and/or valued in discourse, as well as the perceived nature and values of the community flagging and voting on those comments. They are thus heterogeneous technologies. Automated moderation systems at times seem deceptively simple, but thinking about them in this way reveals a surprising amount of complexity at work.
-Contributed by Josh Braun, Quinnipiac University Department of Film, Video, and Interactive Media-