The Background: “Dead and Buried”
At the end of 2013, a flare-up involving the work of Daniel Miller and his colleagues revealed a complex set of tensions regarding how digital culture scholars, journalists, and others in the technology sector address public engagement, prediction, and ethnographic methods. In this essay I use these tensions to investigate a widespread discourse where public engagement is linked to predicting success and failure. This discourse overlaps with longstanding debates over ethnographic methods, but also with the technology sector's obsession with the future (particularly with regard to profitability). This has implications for digital culture research and suggests there may be value in bolstering genres of public engagement cast in terms beyond those of success, failure, and prediction.
Daniel Miller, a well-known anthropologist of internet culture leading a multi-sited team studying social network sites, published the blogpost “What Will We Learn from the Fall of Facebook?” on November 24, 2013, summarizing research in progress on 16- to 18-year-olds in the United Kingdom. He touched briefly on predicting possible futures based on his data: “I don’t expect Facebook to necessarily disappear altogether. Rather it is finally finding its appropriate niche where it will remain. But I think it’s finished for the young in the UK and I suspect other countries will follow.” However, the bulk of his analysis addressed the present, as illuminated by ethnographic data: “For this group Facebook is not just falling, it is basically dead, finished, kaput, over. It is about the least cool thing you could be associated with on the planet.” Nearly one month later (on December 20), another version of this blogpost, rewritten by a journalist with Miller’s approval, appeared in another online forum. Entitled “Facebook’s So Uncool, but It’s Morphing Into a Different Beast,” this version included the phrase “What we’ve learned from working with 16-18 year olds in the UK is that Facebook is not just on the slide, it is basically dead and buried.”
It was this second version of Miller’s post in particular that gained a fair share of incredulous media attention. An indicative example was “Facebook: Not Dead, Not Buried.” In this article, published on December 30, the journalist Rory Cellan-Jones stated that “the man who sold, perhaps oversold, the story turns out to be Professor Miller.” Noting “I’ve seen plenty such stories over the years,” Cellan-Jones focused on method, asking “do interviews with some 16 to 18 year olds in one small area really tell us that young people are leaving Facebook ‘in their droves’ and herald a ‘sustained decline’? That seems quite a stretch—the plural of anecdote is not data, as the man said.” (Note that the phrase “sustained decline,” which implies prediction and was seized on here by Cellan-Jones, never appeared in Miller’s original post.)
The same day that Cellan-Jones’s article appeared, Miller posted a response to the coverage titled “Scholarship, Integrity, and Going Viral.” He emphasized “My blog post on ‘The Fall of Facebook’ was not so much about the decline of Facebook amongst schoolchildren as trying to understand what we can learn from this.” He added that the journalist’s version of the blogpost “perhaps over-simplified the original.”
The Conceptual Significance: Prediction as Analysis
Full disclosure: as will be obvious, I am a longstanding admirer of Daniel Miller’s work, including the research under discussion. But while I think Miller is correct and Cellan-Jones incorrect (to the limited extent such a blunt assessment is meaningful), I also think that what is going on is not just oversimplification. The matter is more complex and the stakes far higher. Rather than seek to determine whose claims are true, let us identify the discursive field shared by all parties to the debate—a discursive field in which all digital culture scholars are implicated, including myself. This discursive field makes it possible to have these debates over truth and falsity in the first place: it allows digital culture to be, in Foucault’s terms, “constituted as a problem of truth.” What I find to be of particular concern is that within this discursive field we find commingled two ideas: first, that the popular value of digital culture research is effectively (even ideally) expressed in a language of predicting success and failure; and second, that ethnographic methods are doomed to failure because anecdotal.
Observing that Miller’s postings did (as he noted) “go viral,” we should ask: what precisely went viral? Why did these postings garner such attention? Clearly, their virality was linked to assessments of shifting popularity—despite the fact that this was not the focus of Miller’s posts! An analysis that emphasized the present was recast in a language of futurity. As noted above, Miller’s blogpost was originally titled “What Will We Learn from the Fall of Facebook?”, appearing as “Facebook’s So Uncool, but It’s Morphing Into a Different Beast” in its rewritten version. The key phrases seized on by others included statements that Facebook is basically dead, finished, kaput, over.
Predicting the future with regard to popularity is an obsession in entrepreneurial worlds and particularly the technology sector, where “trending” is a verb and corporations pay consultancy firms handsomely to foretell what will come. Indeed, oftentimes the most important consequences of such “predictions” concern the present-day share values of companies like Facebook itself. Miller reflected on the place of prediction in his response: “looking back on my career as an academic I have rarely made predictions, partly because when I have, they have almost always turned out to be wrong”—yet in this case predicted his own prediction “will prove correct.” He added “I have another six months to continue this research, expanding on these findings but also exploring in much more detail why these trends develop and what we can learn from them.”
This is the discursive shift: the formerly entrepreneurial goal of prediction can now represent at least part of one’s goals as an ethnographer. While exploring a trend is obviously not Miller’s only (or even primary) research goal, as stated here it is something from which we are to learn. Analysis takes the form of prophecy, but prophecy of a specific kind: assessments of something’s popularity rising or falling in a linear fashion, often cast in a language of life and death (even burial). Now, all parties to the debate provided more nuanced analysis than these tropes suggest. Miller, for instance, emphasized he was talking about a decline specific to teens and also that these teens might keep their Facebook accounts for communicating with family members. Yet the framing by journalists worked against this nuance because it cast change in terms of growth or decline, once again paralleling languages of entrepreneurship. It may be worthwhile to ask how the salience of “youth” as a topic of study is informed by these temporal frameworks.
The Interdisciplinary Significance: Law and Meaning
Miller observed that he rarely made predictions earlier in his career “partly because… they have almost always turned out to be wrong.” But as the word “partly” suggests, this is only a partial explanation. There is a broader context in play: his former aversion to prediction is intelligible from a historical point of view. The history of anthropology provides an effective summary of this context, but the history is interdisciplinary and influences all domains of digital culture scholarship. In addition, it shapes the corporate and popular perspectives that increasingly reframe scholarly observations on the present-day characteristics of digital culture as predictions of future success and failure.
As the discipline took form in the late nineteenth and early twentieth centuries, dominant anthropological paradigms emphasized understanding contemporary lifeworlds and the integration of various aspects of everyday existence into such lifeworlds. The primary disagreement was with evolutionary approaches that were in some cases explicitly linked to Social Darwinism and eugenics, but also with positivist approaches seeking predictive laws. For instance, Malinowski and other “functionalists” asserted that any element of culture serves to meet some need. This challenged, for instance, the earlier evolutionary paradigm of E. B. Tylor, who examined cultural “survivals” (like a child’s bow and arrow) that might not serve a function but could reveal cultural evolution.
In the United States and beyond, Franz Boas played a pivotal role in challenging evolutionary paradigms. In his classic discussion of Boas’s thought, George Stocking emphasized how Boas questioned these paradigms because they treated aspects of culture in isolation, like trying to understand the evolution of flutes without knowledge of the other orchestral instruments with which they are played, and the character of the music itself. More broadly, Boas (like many other intellectuals of the time) challenged the idea that discovering predictive laws was the only legitimate scholarly goal:
“Boas distinguished two different conceptions of the nature of scientific inquiry. Both had the same starting point: ‘the establishment of facts.’ Both had the same ultimate end: ‘to find the eternal truth.’ But their relationship to facts and their approach to truth were quite different. The difference was that between the ‘physical’ and the ‘historical’ methods. ‘The physicist compares a series of similar facts, from which he isolates the general phenomenon which is common to all of them. Henceforth the single facts become less important to him, as he lays stress on the general law alone.’ The historian, on the other hand, denied that the ‘deduction of laws from phenomena’ was the only approach to ‘eternal truth.’ There was also the method of ‘understanding,’ and for those who chose this route, the attitude toward the individual fact or event was quite different from the physicist’s: ‘Its mere existence entitles it to a full share of our attention; and the knowledge of its existence and evolution in space and time fully satisfies the student, without regard to the laws which it corroborates or which may be deduced from it’.”
Nearly a century later, Clifford Geertz echoed these sentiments when discussing how ethnographic analysis is scientific not in a positivist sense of discovering predictive laws, but in a modality of “clinical inference”:
“Rather than beginning with a set of observations and attempting to subsume them under a governing law, such inference begins with a set of (presumptive) signifiers and attempts to place them within an intelligible frame. [Such a mode of analysis] is not, at least in the strict meaning of the term, predictive. The diagnostician doesn’t predict measles; he decides that someone has them, or at the very most anticipates that someone is rather likely shortly to get them. But this limitation, which is real enough, has commonly been both misunderstood and exaggerated, because it has been taken to mean that cultural interpretation is merely post facto: that, like the peasant in the old story, we first shoot the holes in the fence and then paint the bull’s-eyes around them. It is hardly to be denied that there is a good deal of that sort of thing around, some of it in prominent places. It is to be denied, however, that it is the inevitable outcome of a clinical approach to the use of theory.”
The Rhetorical Significance: Trending Ethnography
The debate in late 2013 regarding the research of Daniel Miller and his colleagues illustrates a discursive shift in which temporality reenters and transforms ethnography (and other methods for digital culture scholarship). But unlike the functionalist (or in a very different way, structuralist) interventions that challenged evolutionary paradigms by focusing on the present, or the historicist interventions of Boas, Geertz, and others that reframed evolutionary paradigms by focusing on the past, the new interventions focus on the future. Note that as in the case discussed here, the scholars in question may not even be focusing on the future. The discursive field in question is not limited to anthropology or even social science research: it is clearly more hegemonic in the technology sector itself, sometimes leading commentators to misinterpret scholarly claims about the present (or primarily about the present) as being wholly about future success or failure.
Rather than explain our past or interpret our present, this is a vision of scholarship (and, I cannot overemphasize, technology entrepreneurship) that takes trends as an object of analysis and the future as an analytical goal. It is an uncertain future: you never know when some company will invent another iPhone or Twitter, or when a formerly cutting-edge technology like MySpace or BlackBerry will go into decline, even vanish. When the goal of ethnography becomes at least in part to predict success or failure, digital culture scholars find themselves in an epistemic territory radically different from the canonical frameworks of Malinowski and Boas, or the evolutionary frameworks they challenged. There may be stronger affinities with some contemporary modes of evolutionary analysis that embrace contingency, used in some human sciences but also with regard to topics like climate change.
Ethnographic practice has long employed prediction in various ways, and it certainly behooves digital culture scholars to consider how temporal arguments can productively shape public engagement as well as the research process. However, I do have concerns regarding prediction and particularly the prediction of success and failure. One concern is that predictions of success and failure could become seen as the sexiest and most fundable forms of scholarly engagement. What would this do to the kinds of questions asked, the methods used to address those questions, and the ways research is communicated to various publics? What has the emphasis on prediction already done in this regard?
As noted earlier, a second concern is that this discursive field, like all discursive fields, moves between and links ostensibly disparate cultural domains. Ethnography itself is one such domain, and it is striking to see how the conceptual framing that characterizes both scholarly and youth assessments of Facebook’s future appears as well in debates over ethnographic methods. When Cellan-Jones claimed “the plural of anecdote is not data, as the man said,” that generic “man” who equates ethnography with anecdote could stand in for a number of digital culture scholars who approach ethnography via the same discursive field UK teenagers apparently use in understanding Facebook. For instance, in his posting “How Online Communities and Flawed Reasoning Sound a Death Knell for Qualitative Methods,” the economist Robert Bloomfield drew the following conclusion from a 2009 discussion with me and several colleagues:
“Enterprising young scholars who are interested in cultural anthropology and are also trained in statistical methods are going to draw out testable predictions from the body of existing qualitative work, and test those predictions by applying experimental or econometric methods to data extracted from virtual worlds and social media. They will garner funding and publicity in the areas where they compete head to head with qualitative researchers, and the latter will be forced to defend their methods and conclusions…. Qualitative methods will either be relegated to less-prestigious schools and special-interest journals in cultural anthropology, or else cultural anthropology will decline in influence relative to other departments (like psychology) that embrace quantitative methods to study similar questions.”
Note the pivotal double role of prediction. First, Bloomfield asserts that testable predictions are the only valid approach for digital culture scholarship (and in so doing, assumes that qualitative and quantitative scholars “compete head to head” rather than collaborate). But second, Bloomfield redeploys this exact same language of prediction onto the domain of ethnographic methods. And it is specifically the prediction of success or failure (in this case, failure), in the language of a “death knell.” This is the same discursive field in which one can predict Facebook’s future as “dead and buried.”
I could multiply examples but the overall point should be clear.[i] I worry that like the UK teens discussed by Miller or so many technology “evangelists,” digital culture scholars might decide that particular phenomena like Facebook are the least cool thing we could be associated with on the planet. And in framing the phenomena in question as basically dead, finished, kaput, over, there is a risk that digital culture scholars (including myself) might reinforce a discursive field, strongly shaped by the technology sector, that emphasizes trends over analysis and context. Even when the diagnosis of success or failure is accurate, participation in this discursive field limits what can be said.
The Methodological Significance: Ethnography, Genre, and Time
The dangers of this discursive field are methodological as well as rhetorical. Indeed, they link up to broader concerns regarding what David Karpf has termed “internet time”—the fear that “In the time it takes to formulate, fund, conduct, revise, and publish a significant research question, we all are left to worry that changes in the media environment will render our work obsolete.”[ii] Framing one’s research goal in terms of prediction—and particularly in terms of predicting success or failure—might seem to be a way to forge anticipatory relevance, but it comes at a price. The future of ethnographic methods is not exterior to such a framing. Consider, for instance, George Marcus’s discussion of the “unbearable slowness” of ethnography in the contemporary period, recalling Karpf’s analysis of internet time.[iii] Given my own work on questions of method, it should not be surprising that I frequently end up advising graduate students from a range of institutions and disciplines who wish to engage in ethnographic studies online.[iv] What has shocked me is how often queries that begin “how can I do ethnographic research with this online community” become “how can I do ethnographic research in two months because I only have funding support for that long.”
The design of graduate programs is certainly germane to this sense of unbearable slowness, particularly as communications and media studies departments turn toward the digital. These departments often do not have an institutional history of providing for a year or more of ethnographic fieldwork. Other approaches predominate in those disciplines, so that ethnographic methods are misunderstood (most commonly by conflating them with interviews in isolation, thereby missing the central role of participant observation). But the misunderstanding has to do with temporality as well as institutional structures. Ethnographic methods are about learning a culture by making oneself vulnerable to being transformed by it. And “not unlike learning another language, [ethnographic] inquiry requires time and patience. There are no shortcuts.”[v]
Learning a language (or a culture) is not the same thing as predicting a language, and you cannot learn Japanese or Portuguese in a month—no software package can efface this reality. There is a world of difference between using Google Translate to translate another language, and speaking that language. It is a question of meaning, of understanding ways of living, not of prediction. As Miller noted in his response, ethnographic research “also means the countless informal encounters with people who live in the area. Of particular importance is direct observation and participation, so you know what people are doing and you don’t just rely on what they say they are doing.” Thus, “even as ethnography changes its modus operandi and its identity, there is nothing that suggests that the valuing of a patient, deliberate norm of temporality will not continue to be necessary.”[vi] The fact that Miller’s team has insisted on fifteen-month periods of coordinated ethnographic research is exemplary, when so often in the digital field we find notions of “ethnography lite” or “rapid ethnography.” But is the value of such sustained research recognized in the era of big data?
Such engagements are valuable because gaining cultural “fluency” is central to ethnographic inquiry: such inquiry gains its strength from building social relations with the persons and contexts studied. This permits collaborations and deeper connections that allow researchers to learn about forms of tacit knowledge that may never be explicitly discussed. This might be because the topics are taboo or sensitive, but more often it is that they are taken for granted. Language again illustrates this well: most speakers of any language cannot consciously describe the grammars they employ. For instance, most English speakers could not explain why they make an “s” sound for the plural “cats” but a “z” sound for the plural “dogs”: they do not consciously know the distinction between voiced and unvoiced consonants, or the English phonological rule of assimilation specifying that the plural suffix takes the voicing of the sound that precedes it. Many aspects of culture—often the most significant ones—are similarly implicit, “unspoken” even when used in everyday interaction. One of the greatest contributions of ethnographic methods is to illuminate such tacit knowledges, which can be accessed only to a very limited degree (if at all) by interviews and focus groups that rely on explicit statements, or by “big data” methods that rely on the algorithmic discovery of patterns, not shared beliefs. These tacit knowledges, the core of culture, are in the everyday present—not the future.
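Once a linguist has made the rule explicit, it can be written down in a few lines. The following sketch is my own illustration (not part of the research under discussion), with deliberately simplified sound categories:

```python
# Illustrative sketch of the English plural voicing rule (simplified).
# The sound categories below are reduced for illustration, not a full phonology.

SIBILANTS = {"s", "z", "ʃ", "ʒ", "tʃ", "dʒ"}  # e.g., "horses" takes /ɪz/
VOICELESS = {"p", "t", "k", "f", "θ"}         # e.g., "cats" takes /s/

def plural_suffix(final_sound: str) -> str:
    """Return the plural allomorph for a word ending in final_sound."""
    if final_sound in SIBILANTS:
        return "ɪz"   # "horses", "judges"
    if final_sound in VOICELESS:
        return "s"    # "cats", "cliffs"
    return "z"        # "dogs", "bees" (voiced finals)

print(plural_suffix("t"))  # s  ("cats")
print(plural_suffix("g"))  # z  ("dogs")
```

The point, of course, is that fluent speakers apply this rule effortlessly without being able to state it: exactly the kind of tacit knowledge that methods relying on explicit statements will miss.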
The discursive field I have discussed in this essay, framing truth in terms of predicting success or failure, contributes mightily to these dynamics. The question of genre is crucial here. Part of the response of digital culture scholars (including but extending beyond the ethnographic and anthropological traditions I focus on here) will involve asserting the validity of multiple genres of presenting research results. Critically, this means resisting attempts to conflate more informal presentations of work-in-progress with predictions of the future—a conflation shaped by the discursive field under discussion. Indeed, this essay is an example of the kind of informal scholarship that can ideally build engagements in multiple spheres, from academia to advocacy to industry, but only if the reality and value of multiple genres is kept in mind.
As Miller noted, “given the interest in our topic, [my research team keeps] a blog of interim findings and stories. We would prefer our final reports to go viral rather than our blog posts (there was no press release), but we now appreciate we have no control over this.” The discursive field I have discussed helps shape this sense of a lack of control, as well as the sense that public engagement best takes the form of predicting success or failure. It is this discursive field that contributes to the increasing pressure graduate student ethnographers (as well as more senior researchers like Miller) feel to publish before fieldwork is completed—despite the fact that historically this was the norm only in more laboratory-based disciplines. And it is this discursive field that threatens to drastically curtail the ask-able questions of ethnographic inquiry and its imaginable place in the study of digital culture.
The Conclusion: Which Future?
In this essay I have sought to identify a discursive field that conflates analysis with predicting success or failure. This discursive field overemphasizes unknowable futures and treats the search for predictive laws as the most valuable mode of inquiry. It resonates disturbingly not just with popular cultural and tech entrepreneurial obsessions with assigning value to the novel, but also with claims that ethnographic methods themselves are dead, finished, kaput, over.
Calls for digital culture scholars to engage with the public are increasingly being made in the language of this discursive field. This is dangerous because it limits what will be seen as a valid research finding and what genres will be acceptable for discussing research findings. It is also dangerous because this same discursive field has been used to delegitimate ethnographic inquiry by predicting its failure. However, I am emphatically not saying that prediction is always problematic. It can be a valuable aspect of many research agendas, including those of ethnographers. To mention just one personal example, in my work on HIV/AIDS prevention in Indonesia epidemiological analysis was critical, and one component of such analysis was the prediction of future trends, which had consequences for program design. The problem is when this discursive field of predicting success or failure is construed as both the ideal mode of public engagement and the ideal mode of analytic significance.
As a case study to illustrate these issues I have summarized a debate over the work of my colleague Daniel Miller regarding social media, youth, and the future. At the risk of belaboring the obvious, I do not see Miller or anyone else discussed in this essay as incorrect, wrong, trapped in a discourse, or the like. Instead, I see all digital culture scholars (including myself) as shaped by this discursive field that frames truthmaking in terms of predicting success or failure. To do so is not necessarily incorrect or ineffective—rather, it is tricky and not the only valid path to knowledge.
The question, then, is “what does the future hold for ethnographic methods and their relevance”—or better, what kind of futurity do we seek?
-Tom Boellstorff is Professor of Anthropology at the University of California, Irvine. His current research is supported in part by the Intel Science & Technology Center for Social Computing.
Acknowledgments
I would like to thank Bill Maurer and Daniel Miller for their thoughtful comments on a draft of this essay. Their insights helped me refine my argument, underscoring the value of peer review even in more informal genres.
[i] Another example of such thinking is when the economist Edward Castronova advocated for the use of virtual worlds as experimental models by asserting that “the results are not based on the researcher’s impression after having spent 12 months living with a small subset of one of the populations… [this] mode of study is at least as reliable, and quite probably more so, than those that precede it…. That being the case, a major realignment of social science research methods would seem to be in order” (Edward Castronova, “On the Research Value of Large Games: Natural Experiments in Norrath and Camelot,” Games and Culture 1(2):163–86, 2006).
[ii] David Karpf, “Social Science Research Methods in Internet Time.” Information, Communication, and Society 15(5):639–661, 2012.
[iii] George Marcus, “On the Unbearable Slowness of Being an Anthropologist Now: Notes on a Contemporary Anxiety in the Making of Ethnography.” XCP: Cross-cultural Poetics 12:7–20, 2003.
[iv] See Tom Boellstorff, Bonnie Nardi, Celia Pearce, and T.L. Taylor, Ethnography and Virtual Worlds: A Handbook of Method (Princeton: Princeton University Press, 2013).
[v] Renato Rosaldo, Culture and Truth: The Remaking of Social Analysis (Boston: Beacon Press, 1989), p. 25.
[vi] Marcus, “On the Unbearable Slowness of Being an Anthropologist Now,” pp. 16–17.