"More than machinery, we need humanity."
The announcement that Mark Zuckerberg and Priscilla Chan would be donating $300 million to help address some of the challenges COVID-19 poses for the 2020 elections was met with a great deal of derision. The scorn was not directed at the effort to recruit poll workers, or purchase PPE for them, but at the source of the funds. Facebook had profited massively from allowing COVID-19 misinformation to run rampant across its platform, and had shirked responsibility as that platform exacerbated political tensions, so the funding announcement came across not only as too little too late, but as a desperate publicity stunt. The incident was but another installment in Facebook’s tumult as the company (alongside its CEO and founder) continually finds itself cast as a villain. Facebook can take some solace in knowing that other tech companies—Google, Amazon, Uber—are also receiving increasingly negative attention, and yet it seems that for every critical story about Amazon there are five harsh pieces about Facebook.
Where Facebook, and Zuckerberg, had once enjoyed laudatory coverage, with the platform hailed as an ally of democracy, by 2020 it had become increasingly common to see Facebook (and Zuckerberg) treated as democracy’s gravediggers. Indeed, much of the animus in the increasingly barbed responses to Facebook seems to stem from a sense of betrayal. Many people, including more than a few journalists and scholars, had initially been taken in by Facebook’s promises of a more open and connected world, even if they are now loath to admit that they ever fell for that ruse. Certainly, or so the shift in sentiment conveys, Facebook and Zuckerberg deserve to be angrily upbraided and treated with withering skepticism now… but who could have seen this coming?
“Technologies are not merely aids to human activity, but also powerful forces acting to reshape that activity and its meaning” (6). When those words were first published, in 1986, Mark Zuckerberg was around two years old, and yet they provide a more concise explanation of Facebook than any Facebook press release or defensive public speech by Zuckerberg. Granted, those words were not written specifically about Facebook (how could they have been?), but to express a key insight about the ways in which technologies impact the societies in which they are deployed. The point is not only that technologies can have political implications, but that technologies are themselves political. Or to put it slightly differently, Langdon Winner was warning about Facebook before there was a Facebook to warn about.
More than thirty years after its initial publication, The University of Chicago Press has released a new edition of Langdon Winner’s The Whale and the Reactor. Considering the frequency with which this book, particularly its second chapter “Do Artifacts Have Politics?,” is still cited today, it is hard to suggest that Winner’s book has been forgotten by scholars. And beyond the academy, those who have spent even a small amount of time reading some of the prominent recent STS or media studies works will likely have come across his name. Therefore, the publication of this second edition—equipped with a new preface, a new afterword, an additional chapter, and a spiffy red cover—represents an important opportunity to revisit Winner’s work. While its citational staying power suggests that The Whale and the Reactor has become something of an essential touchstone for works on the politics of technological systems, the larger concerns coursing through the book have not lost any of their weight in the years since it was published.
For at its core The Whale and the Reactor is not about the types of technologies we are making, but about the type of society we are making.
Divided into three sections, The Whale and the Reactor wastes no time in laying out its central intervention. Noting that technology had rarely been treated as a serious topic for philosophical inquiry, Winner sets about arguing that an examined life must examine the technological systems that sustain that life. That technology has so often been relegated to the background has given rise to a sort of “technological somnambulism” whereby many “willingly sleepwalk” as the world is technologically reconfigured around them (10). Moving forward in this dreamy state, the sleepers may have some vague awareness of the extent to which these technological systems are becoming interwoven into their daily lives, but by the time they awaken (supposing they ever do) these systems have accumulated sufficient momentum that turning them off seems all but impossible. Though The Whale and the Reactor is not a treatise on somnambulism, this characterization is significant insofar as a sleepwalker is one who staggers through the world in a state of unawareness, and thus cannot be held truly responsible. Contrary to such fecklessness, the argument presented by Winner is that responsibility for the world being remade by technology is shared by all those who live in that world. Sleepwalking is not an acceptable excuse.
In what is almost certainly the best-known section of the book, Winner considers whether artifacts have politics—answering strongly in the affirmative. Couching his commentary in a recognition that “Scarcely a new invention comes along that someone doesn’t proclaim it as the salvation of a free society” (20), Winner highlights that social and economic forces leave clear markers on technologies, but he notes that the process works in the opposite direction as well. He explores two primary ways in which “artifacts can contain political priorities” (22): first, situations wherein a certain artifact is designed in such a way as to settle a particular larger issue; and second, technologies that are designed to function within, and reinforce, a certain variety of political organization. As an example of the first variety, Winner points to mechanization at a nineteenth-century reaper manufacturing plant, where mechanization was pursued not to produce higher-quality or less expensive products, but to break the power of the factory’s union. An example of the second sort of politics can be seen in atomic weaponry (and nuclear power), whose very existence necessitates complex organizations of control and secrecy. Though, of the two arguments, Winner frames the first example as presenting clearer proof, technologies of the latter sort make a significant impact insofar as they tend to make “moral reasons other than those of practical necessity appear increasingly obsolete” (36) for the political governance of technological systems.
Inquiring into the politics of a particular technology provides a means of asking questions about the broader society, specifically: what kind of social order gets reified by this technology? One of freedom and equality? One of control and disenfranchisement? Or one that distracts from the maintenance of the status quo by providing the majority with a share in technological abundance? It is easy to avoid answering such questions when you are sleepwalking, and as a result, “without anyone having explicitly chosen it, dependency upon highly centralized organizations has gradually become a dominant social form” (47). That this has not been “explicitly chosen” is partially a result of the dominance of a technologically optimistic viewpoint that has held to “a conviction that all technology—whatever its size, shape, or complexion—is inherently liberating” (50). Though this bright-eyed outlook is periodically challenged by an awareness of the ways that some technologies can create or exacerbate hazards, these dangers wind up being treated largely as hurdles that will be overcome by further technological progress. When all technologies are seen as “inherently liberating,” a situation arises wherein “liberation” comes to be seen only in terms of what can be technologically delivered. Thus, the challenge is to ask “What forms of technology are compatible with the kind of society we want to build?” (52) rather than simply assume that we will be content in whatever world we sleepily wander into. Rather than trust that technology will be “inherently liberating,” Winner emphasizes that it is necessary to ask what kinds of technology will be “compatible with freedom, social justice, and other key political ends” (55), and to pursue those technologies.