Dialogue: reflecting on Chapter 5 of John Gilliom and Torin Monahan’s SuperVision: An Introduction to the Surveillance Society

A few weeks ago, Torin Monahan shared a chapter from his new book with John Gilliom, SuperVision: An Introduction to the Surveillance Society, published by the University of Chicago Press. (You can still read chapter 5 of the book here.) A number of Culture Digitally participants commented, and Torin had a chance to respond. Rather than losing this discussion in the comment thread, we thought we’d re-post it here.


Casey O’Donnell:

Torin and John, thanks for sharing this chapter with the #cultd community. I found the material compelling on multiple levels, but I’d like to ask about one direction in particular. Clearly, the implications of surveillance in the workplace are extremely important, but I wanted to ask about your encounters with increasingly algorithmic or “gameified” surveillance. I think you’re touching on precisely some of the reasons game developers have reacted so negatively to gameification: it plugs into broader logics that developers are quite apprehensive about.

As surveillance expands, the amount of data (“metrics”) will continue to grow and become increasingly less parseable by individual humans (going back and correlating email messages with key-fob door-entry data), so monitoring will fall to algorithms. Tarleton may want to chime in, but I see all of these as critically related.

The other, more tangential, element I’m curious about is the “gaming” of these systems. Call centers in particular are fascinating, and indeed for many who work in them they are crushing/grinding systems. Yet I’ve known some who identify the way the systems function and how they are evaluated, and game them. Of course, getting caught in such an act would likely endanger a worker’s livelihood, but I’m fascinated by the ability of people to always exceed these systems.

I’m certainly not saying that this makes them any less problematic, but that they will always ultimately fail to discipline all players, and that the fantasy of those who imagine these systems will provide a new kind of perfectly managed workplace is just that: a fantasy. It’s much like attempting to stamp out all instances of cheating in a game or the piracy of data. Systems/algorithms are always an abstraction with perspective, and as such they never really deliver on their panoptic promise. Perhaps they even only perpetuate greater expense and diminishing returns.


Lee Humphreys:

Casey, I too was super interested in the gaming of the surveillance systems. It was great to see a book by surveillance experts that’s written for a non-academic audience. As academics we often speak to a very closed group of colleagues and scholars who share intellectual motivations, assumptions, and experiences. Therefore I was very excited to get a chance to read a chapter.

“Watching You Work” highlighted some interesting tensions of surveillance mechanisms, particularly around the use of technologies in the workplace for surveillance that were not designed to be used that way. The chapter nicely revealed important but subtle ways that mechanisms of surveillance could be used against people in their work environment.

The two questions that arose for me when reading this chapter were about the actors. First, how can we characterize the “you” in this chapter? The preface suggests the book was written for students, friends, and neighbors, which I interpreted as a general public. The discussions of Taylorism and Fordism suggest a narrower “you” — you are the laborer, the worker, the employee, the Anna, the Nurse Betty, etc. But the chapter also suggests that Taylorism brought about the need for a class or group of individuals to manage, plan, and evaluate the work being done by others. These mid-level managers are not part of the “you” addressed and defined in this chapter.

For me this raised a second question, which is, what would it mean to write a book about surveillance for those who survey, not merely about those who are the objects of surveillance? If one of the goals is to reveal surveillance mechanisms to “give readers the tools to understand and critically engage” with them, then those mid-level managers who are charged with increasing efficiency and productivity could also be a worthwhile audience to address. How might a book about surveillance capture the experience of those who survey and help them to more critically reflect on their evaluations of others?

I can’t help but think that some people are uncritically using surveillance metrics and technologies to respond to pressure they themselves are feeling to perform in the workplace. Is power being usurped from them by a technological system, which informs them which of their employees is “good” and which is “bad”?

Sometimes these mid-level managers are charged with the development and implementation of surveillance mechanisms. What would it mean to write a book to help them see the ways that surveillance undermines the efficacy of employees? If this book is meant to reveal, to question, and to make the reader who hasn’t thought about surveillance begin to think about it, then why not engage with this other potential audience of the surveyor?

There are organizations that strive to empower their employees because they see the benefits of it; for example, organizations that do “360” evaluations where bosses, peers, and employees evaluate each other. How might these work environments be unintentionally employing surveillance strategies that undermine their collective values and goals? I know many people who are managers of others in one way or another. None of them are “the boss” but they are “a boss”. How might their stories reveal another side of work and the surveillance society? How might they be gaming the mechanisms of surveillance as well?


Torin Monahan:

Thank you, Casey and Lee, for these terrific questions and comments. Concerning the question about how algorithmic surveillance might manifest, I’d say that it plays a key role in many unfolding, and overlapping, surveillance systems. (Perhaps algorithmic surveillance is more clearly presented in the book’s chapters on online worlds and security systems—where we discuss Google, DHS fusion centers, and data aggregators, among other examples.) Still, this is a challenge we wrestled with: How does one communicate the importance of opaque, automated operations that are nonetheless hugely consequential? For a chapter like “Watching You Work,” we opted to foreground various mechanisms of workplace surveillance and try to situate them within a historical context, while at the same time flagging the intensification of surveillance made possible by the development and adoption of new technological systems. So, the example we give of surveillance at call centers shows the automated triage of customers based on their perceived importance, while, at the same time, the system controls call-center employees by automating their dialing and capturing fine-grained data on their performance. Actually, algorithmic surveillance is present in many forms of performance monitoring (whether of hotel cleaning staff, nurses, casino dealers, or others). When social sorting occurs through automated processes, it becomes vital to critique the inequalities that persist and are normalized by such systems. I’d definitely love to hear some of the pedagogical approaches that others have found for getting students to engage with these (sometimes) abstract issues.

The gaming of surveillance systems also presents a rich dimension of surveillance societies. It’s certainly something John Gilliom and I get the most excited about, in part because the creativity and agency revealed by such forms of resistance undermine any totalizing notions of control. The gaming of systems also oftentimes prompts new developments in surveillance, which is another good reason to take these practices seriously. For instance, in the sample chapter we see workers gaming the various systems by sending false emails about clock-in times, using hotel-room phones to register that rooms had been cleaned when they had not, putting customers on hold to buy some extra time between calls, cutting off hospital patients’ wristbands to decrease the workload, etc. Almost every one of these acts of resistance, once discovered by employers, engenders a new mechanism of control. While John and I have been somewhat optimistic about the potential of resistance in the past, near the end of the book we started leaning toward a need for deeper structural (or infrastructural) changes to capitalize on the empowering possibilities for surveillance while holding the disempowering developments in check. What that might look like, though, would probably require another book altogether!

Lee called attention to our imagined audience and asked, “What would it mean to write a book about surveillance for those who survey, not merely about those who are the objects of surveillance?” The implied “you” of this book is something we agonized over. Not so much because of one’s position in workplace hierarchies, necessarily, but more so because of assumed cultural privilege or universalism, which would surely fail to account sufficiently for difference, especially on a global scale. Part of what we were trying to do in this short book was to illustrate the many nested relationships of surveillance across multiple domains of social life. It becomes much more difficult to ask questions about how one can—or should—surveil responsibly, or with “care” for those being observed and guided. I can say that there are some really good qualitative studies of police and security camera-room operators, for instance, who as workers and managers often exercise very little power and are caught in jurisdictional turf wars; they are certainly not the all-powerful or all-seeing control agents that they are often presented to be. (I’m thinking here of the work of Benjamin Goold, Gavin Smith, and Peter Fussey.) The literature on the surveillance of children is also productively complicated in these ways, pushing understandings of surveillance more toward the “care” end of the care-control spectrum (e.g., the work of Valerie Steeves and Margaret Nelson). But aside from how-to management books, it’s less obvious what a critical book for managers would look like, especially as trust seems to be diminishing along with job security across many sectors, while digital surveillance facilitates the management of outsourced and homesourced jobs.

As with “ethics” education for other professionals, such as engineers or physicians, I’d probably place little faith in the likelihood of responsible surveillance guidelines altering radically the data-collection and risk-management imperatives of institutions. That’s not to say that resistance isn’t occurring, as it certainly is, but aside from organized efforts such as those of labor unions, will the individual efforts of well-intentioned, mid-level managers be able to stem the flood of workplace surveillance? In truth, I’m not that hopeful.


Sam Srauy:

Like Lee and Casey before me, I really like this chapter as well. Again, thanks for sharing with our community! I had the same suspicion that Lee had — the possibility that those in middle management were also subjected to disciplinary currents. What brought that home for me is the example of Patricia Dunn. That Dunn apparently spied on members of the board clearly demonstrates that surveillance extends beyond the level of labor.

Considering that along with the next section about the war on drugs, I had to wonder: To what extent is this form of surveillance normalized? Can we say that it is ubiquitous? Because you tie surveillance to the war on drugs in the 1980s, I think you’re saying that it is ubiquitous — or at least normalized by apparently non-related discourses.

Of course, all of this makes me wonder, how can we resist? Those in power (whoever they happen to be at any point in time and circumstance) ultimately get to say how these practices work. As you point out, the groups that typically offer resistance (unions) have been losing power, and it’s still too soon to see what role social media et cetera will play, so I wonder where resistance can happen. Or does it happen only in corporate practices such as the 360 evaluations Lee mentioned?

At any rate, I am very happy that you have “pulled back the curtain” for non-academics (and academics who aren’t specialists in this area). I can’t wait to read the whole book.


Nick Couldry:

Thanks to Torin for his eloquent and lucid chapter. It introduces a whole range of current surveillance practices now affecting the standard workplace.

I was particularly interested in the story it uncovered about how the ‘facts’ of the workplace are changing. Both at the start of the chapter and in the firing example from the discussion of hospital surveillance systems (page 98), it becomes clear that the data array on which employers can draw to establish the basic ‘facts’ of an employment relationship is changing radically.

Obvious in one way, but profound in its consequences: however non-consensual the establishment of a surveillance system, its ‘facts’ generate a basis for employer action (including an automatic rationale for particular evaluations of employees) that is non-negotiable, because factual. This in turn provides a justification for immediate judgements and irreversible actions – firing employees – even though there has never been a discussion about whether there exist clear norms for how employees should adapt to the fact of continual surveillance. Nor is there any consensus about whether the asymmetrical information access that surveillance provides is acceptable in all of the domains where it is applied (for example, social media, where people’s informal lives are on display to apparently limited groups of people).

Surveillance extends the normal asymmetry of workplace power relationships into work times and leisure times that were previously immune from it, providing a massively more extensive data array on the basis of which non-contestable and ‘rational’ decisions can be taken without any possible recourse. When combined with big data capacities, being developed for other purposes, the result is even more disturbing.