"More than machinery, we need humanity."
Here is a thought experiment: reflect back on the last forty-eight hours of your life and ask how often you found yourself relying on computerized technology to perform a function that you (or at least another human) used to perform. Did you use GPS instead of pulling out a map? Did you sit on a plane captained by autopilot? Did you use spell-check instead of pulling out a dictionary? Did you send an e-mail instead of writing a letter? The list could certainly go on. Now consider the last forty-eight hours again, and ask yourself how many of the tasks still performed by an actual human may be taken over by computerized technology in the years ahead. And take note of whether such reflection fills you with excitement or provokes a certain anxious feeling.
A great deal has already been given over to automation, and more likely will be in the future, but that does not mean the shift is altogether positive.
While automation is hardly a new process it does seem to be reaching ever more pervasive heights and in so doing has taken on a certain ideological quality. We are repeatedly assured that – despite some temporary discomfort – automation will improve our lives and usher in a technological utopia. Indeed, we are also repeatedly assured, automation has already been improving our lives for years. And yet such improvements may represent more of a tradeoff than we were originally told – hard won skills may evaporate, the sense of fulfillment from a complex task may diminish, and in relying ever more on computers we may find ourselves losing sight of what it means to be human. These concerns are at the core of Nicholas Carr’s book The Glass Cage: Automation and Us, a text which acts as a compelling reminder that critical thinking is a task that has not been automated – not as of yet, at least.
The world upon which Carr looks is one that is in the throes of change yet is also going through the continuation of a lengthy process – there is a through line that connects the replacement of “manual” gear-shifting with “automatic” to the self-driving car. There is nothing particularly new about humans shifting more tasks, and responsibilities, to various technologies, even if the growing ubiquity of computer technology has allowed this shift to take place in ever more sectors. The advocates of automation describe the process as one that offers a form of liberation – it frees us from mundane tasks and unpleasant labor, giving us ever-greater opportunities for personal reflection and self-fulfillment. Yet it seems we wind up frittering away much of the “free time” automation has given us on idle technological pursuits. Technology may give us more time for fulfilling activities, but it also provides us with ample ways of filling that time – from non-stop streaming video, to constant games, to a steady flow of tweets, texts and social media updates. Automation provides us with time to enjoy bread and circuses – it even makes it easier for us to have the bread delivered and to watch the circus.
Much of the concern, historically and currently, surrounding automation is related to the way that it impinges upon labor – directly threatening the ability of some workers to continue earning a livelihood. Indeed, as Carr reminds the reader, the group whose name has come to be (unfairly and inaccurately) linked to knee-jerk opposition to technology – the Luddites – were actually skilled laborers witnessing their craft being eviscerated by the early forces of industrialization and mechanization (the precursors to automation). While there have been few attempts at resistance to automation as iconic as the Luddite risings, the concerns about displacement wrought by automation have not diminished – if anything, increases in technological sophistication have meant that ever more areas are now potential victims of the machine. Alas, even a so-called “white-collar job” is not necessarily a safe career anymore, as pilots, professors, doctors and lawyers all see their livelihoods becoming more precarious. Likewise the core skills that had once been so important for some of these fields are gradually diminished or turned over to computer programs. As Carr glumly observes:
“When automation reaches its highest level, when it takes command of the job, the worker, skillwise, has nowhere to go but down.” (113)
The above is a point which should be connected with an observation Carr makes a few pages later, namely that:
“A single computer can take over the work of dozens of well-paid professionals.” (116)
The matter of what automation does to hard won human skills is a topic that Carr devotes particular attention to – not simply because people are losing their jobs, but because in some instances such deskilling can result in the loss of life. While the words “on autopilot” may refer to something quite particular in regards to aviation, such terms also capture much of the experience of automation wherein people put ever more responsibility and trust into computerized systems. But when the autopilot fails? When a person must take over? The complacent comfort encouraged by automation can make it perilous for a person to reassume control – particularly as an overreliance on automation may have resulted in the essential skill set (built up and honed by practice) having steadily withered away. When a person must take control of the steering wheel once more – as Carr demonstrates by discussing various aviation tragedies (and one example of impressive human skill) – if they are not prepared to do so the results can prove fatal.
While airplanes may seem a particularly stark example of this, the energy currently being devoted towards “self-driving” cars suggests this topic is one worthy of serious reflection. If the person “behind the wheel” has been turned into just another passenger by automation – what happens when they suddenly need to take the wheel? While it is true that “human error” is often treated as the culprit, it should not be forgotten that automation – through deskilling and an overreliance upon the computer – may accidentally encourage such errors. Is not the initial and more dire “human error” the one that privileged the computer to the detriment of the human? Yet the lack of human omniscience is enough to make some suggest that things would be better – or at least safer – if we simply turned over more control to automation: self-driving cars, autopilot from takeoff to landing, and even the possibility of a battlefield filled with automatons.
Easily lost in the mania for ever more automation are questions regarding the underlying ethics and ideologies pushing it forward. That the technology exists to do something – to automate a given task – does not immediately mean that it should be done. At present it seems that the machine has been celebrated to such an extent that humans are fading into the background. The push, often market driven, to open up ever more areas to automation seems to occur regardless of whether or not such automation truly fits with human needs or genuine human desires. After all, as Carr discusses, people may report frustration and annoyance with having to work, but they also derive a sense of satisfaction and accomplishment from their labor. Automation-granted idleness may at first sound pleasant – but if this idleness just gives people more time to watch screens it may not stimulate much satisfaction. Carr takes care to emphasize the many benefits that automation can bring, but he also reminds the reader that:
“once the technology becomes embedded in physical infrastructure, commercial and economic arrangements, and personal and political norms and expectations, changing it becomes enormously difficult. The technology is at that point an integral component of the social status quo. Having amassed great inertial force, it continues down the path it’s on.” (172)
Automation seems to beget automation, and it easily becomes a force that trundles forward thanks to its own social, political and economic momentum. The more control we cede to the computerized technology of automation the more difficult it may be to wrest control back. It is not simply that automation has some particularly vocal advocates, but that those advocates are hardly disinterested parties. Advances in automation are heavily driven by large corporations that see in this process a way of deriving a very real financial benefit and of increasing the power and prestige they already enjoy. And while automation seems to hold out a promise of a leisurely society where our machines have freed us to pursue more enjoyable pursuits – insofar as automation remains bound up in our present economic and political situation we find that the benefits of automation primarily accrue upwards while precariousness flows downwards. If automation is being privileged perhaps it is time to alter our priorities, as Carr writes:
“To ensure society’s well-being in the future, we may need to place limits on automation. We may have to shift our view of progress, putting the emphasis on social and personal flourishing rather than technological advancement. We may even have to entertain an idea that’s come to be considered unthinkable, at least in business circles: giving people precedence over machines.” (228)
The world around us is shaped by the technologies we use – and in shaping the world humanity has come to shape itself. While a new technology can be a powerful extension of our abilities it can also warp them and render us frail in the face of that which we have made. Automation can answer many questions and solve many problems for us – but one question, and one problem, for which it cannot provide an easy answer – is one Carr poses early in The Glass Cage:
“What does human being mean?” (18)
And it may well be that this is the question we need to be asking ourselves most.
* * *
At times it can feel that much of the discourse around technology has fallen victim to automation: one encounters similar arguments rehashed in similar forums with similar conclusions featuring a similar lack of critical engagement. Yet amongst the songs of praise to all things mechanical one is increasingly able to find works willing to question whether the “good news” is really so good. Nicholas Carr’s The Glass Cage is such a book, one that intervenes in contemporary conversations about technology in a way that actually seeks to push the discussion in a productive direction. It may not be wholly accurate to say that The Glass Cage prods the discussion in a “new” direction – but Carr is certainly pushing things in a direction that has largely been forgotten and overlooked.
It is quite likely that Carr’s book will be shrugged off as “Luddite grumbling” by those who cannot stomach the slightest critique of technology, but as Carr continually reminds the reader, he is hardly anti-technology. From his reminiscence of the pleasure he derived from driving a car with a manual transmission, to his reflections on playing video games, to his thoughts on the activity of mowing – it is quite clear that Carr is no opponent of technology. Carr’s (historically accurate!) evocations of the Luddites may lead some to believe that he is calling for a fresh round of machine-breaking, but the point Carr is making is that the unthinking embrace of the machines has led to the machines breaking down much of what it means to be human. Indeed, The Glass Cage is not a critique of technology as such, but a critique of technology as too much.
Particularly noteworthy is Carr’s willingness to use the terminology of ethics in his discussions of technology, as he writes:
“The choices we make, or fail to make, about which tasks we hand off to computers and which we keep for ourselves are not just practical or economic choices. They’re ethical choices. They shape the substance of our lives and the place we make for ourselves in the world.” (18)
And yet lurking in the shadows behind this magnificently confrontational declaration is the question of “choices.” Much of The Glass Cage consists of stories and arguments that discuss the way in which automation (and technology in general) has altered not only how we make choices but even the choices that are available to us. As we find our society increasingly configured around the will to automation we find our ability to choose rather hemmed in. A choice in favor of automation is made – often not by the individual most impacted by it – and this winds up shaping a whole variety of other choices. Far from providing a plethora of new options automation often deprives us of the ability to choose – in the name of increasing freedom it selects a particularly narrow definition of freedom and then demands compliance. One of the things that seems to have been automated is our ability to choose – we can pick A or B, but both of those are options on the same computer screen.
Yet this is still not to satisfactorily identify the “we” in “choices we make.” The tech company Google (and some of its prominent employees) appears as a recurring character in The Glass Cage as Carr discusses things ranging from self-driving cars to Google Glass to Google Maps. Clearly Google is a powerful “we” when it comes to “choices we make,” but for most of us these are really the “choices they make,” not the “choices we make.” This is an important difference to consider in the context of Carr’s comment (quoted earlier) about the way that a technology, once it becomes a powerful force in society, is hard to alter. What this reveals is that there is something fallacious about arguments that state “well, if you don’t like the choices [big company X] makes, don’t use their products” – for insofar as the large firms that make such “choices” have a large impact on the wider society, an individual can only do so much to avoid the fallout of those selections. Put another way, many people do not approve of the choices that led to Google Glass – but that does not prevent their photo from being taken by somebody wearing Google Glass. This is not to unduly target Google (they are being used as an example because Carr writes about them at several points), but it is to make the point that when we think about the “choices we make, or fail to make,” we may be doing so at a point when many of the most important choices have already been made. Thus we can see that these are “ethical choices,” but we need to bear in mind that they are “ethical choices” that have been crushed in the cogs of “practical” and “economic choices.” It is not simply about the “choices we make” but about seeing how the “choices they make” shape the “choices” available.
Recognizing this power relation is essential, particularly in the light of comments from Carr such as:
“Google and other software companies are, of course, in the business of making our lives easier. That’s what we ask them to do, and it’s why we’re devoted to them.” (80)
It is worth looking askance at the above lines. After all, “Google and other software companies” are not “in the business of making our lives easier” – rather they are in the business of staying in business, also known as “making money.” This is not to try to make an unfair point, but simply to indicate that these are for-profit companies in capitalist societies. When there is money to be made in “making our lives easier” these companies will do so, when there is money to be made in selling our private information to advertisers these companies will do so, and if there is money to be made in making our lives more complicated (see: planned obsolescence) they will do so. Likewise it is tricky to claim “that’s what we ask them to do” – since this suggests a level of agency in an initial choice that might not truly have been there (and again, “making our lives easier” is a hard thing to fully parse out). It may be that we are asking for the technological means that have become essential for achieving contemporary ends, but this is not necessarily “easier.”
When “making our lives easier” is the default option – or the marketing spiel – there is not much choice really involved. As for “it’s why we’re devoted to them” – might we not be “devoted” because they have all of our stuff and we do not want to lose access to it? The intention here is not to eviscerate The Glass Cage on the basis of two sentences – to restate, it is a carefully crafted book that is certainly a worthwhile read – but one of the main weaknesses of Carr’s text is that it turns a critical gaze towards technology while taking a somewhat less critical stance towards the larger society that produced these technologies. Granted, much of this criticism is blunted by comments from Carr along the lines of:
“If we don’t understand the commercial, political, intellectual, and ethical motivations of the people writing our software, or the limitation inherent in automated data processing, we open ourselves to manipulation.” (208)
Nevertheless, as The Glass Cage makes consistently and abundantly clear – we can “understand the commercial, political, intellectual, and ethical motivations” and still find ourselves being manipulated.
At a brisk 232 pages, The Glass Cage is a quick, enlightening and enjoyable read – and yet in some ways it leaves the reader feeling as if they have only read two thirds of a book. A very good two thirds of a book – but it still leaves the reader wondering where the next hundred pages have gone. Carr builds up a strong argument about the dangers – and potential benefits – of automation, but the book could have benefited from a turn to a more robust analysis of technological society. At multiple junctures in The Glass Cage the reader comes across statements such as:
“At some point, automation reaches a critical mass. It begins to shape society’s norms, assumptions and ethics.” (193)
But the problem is that if this is true (and Carr convincingly demonstrates that it is true) then it seems that much more is going on than simply the influence of automation. Here it would have been worthwhile for Carr to turn to broader technological critiques such as Lewis Mumford’s “megamachine” or Jacques Ellul’s “technique” – both of which locate automation in a larger set of social, political, economic and ethical relations, treating automation as just one of the more visible processes in a larger technological onslaught. This is not to say that Carr must agree with Mumford or Ellul (or to suggest that he does agree) – but entering into theoretical conversation with such thinkers would have allowed Carr to better hash out his ideas about automation as they function in a broader technological context. Furthermore, concepts such as “the megamachine” and “technique” would help Carr explain some of the problems around the role of engineers, scientists and other tech employees – for Ellul and Mumford both portray such workers as ultimately being in thrall to the technological systems they think they control. Meanwhile, the broad appeal of automation might be further explained by considering a concept such as Mumford’s “megatechnic bribe.” Applying the lens of Ellul or Mumford to Carr’s argument does not disprove anything Carr is writing – but it raises the question of whether Carr’s argument goes far enough. And such thinkers might help push the arguments in The Glass Cage further.
With a slightly woebegone tone Carr hopes that automation will be brought under a more human form of control – but what Ellul and Mumford contribute is the recognition that it is not just automation that must be confronted but the entirety of our socio-technological apparatus. Carr writes with laudable conviction about the need for people to be involved in the decisions about the shape of technology, to be active, not passive, in making choices – but what concepts such as “the megamachine” and “technique” help explain, and demonstrate, are the very reasons and ways in which individuals have been frozen out of these discussions. To borrow another concept from Mumford – an “authoritarian technic” cannot be harnessed as if it were a “democratic technic,” though those who benefit from “authoritarian technics” are certainly happy for people to think the contrary is true. Or, to use a company Carr regularly cites, the problem may not be that Google is making advances in automation – the problem may be companies like Google, full stop. It may be that we are not caught in a glass cage, but that we are still in the “iron cage” of which Max Weber wrote.
What makes it particularly unfortunate that Carr does not situate automation in a much larger, and perhaps harsher, critique (granted, he has written other books) is that the final chapter of the book is a wonderful gesture towards such a critique. With a backwards glance to a poem by Robert Frost, Carr truly turns to the question that he has been toying with over the course of the book: how the human use of technologies changes the human using those tools. What Carr discusses, with a sort of humble recognition, is that some tools can provide people with a fuller experience of the world, that some tools provide for an extension of power without a diminishment of responsibility. The tool that so sparks Carr’s reverie? It is the one that appears in Frost’s sonnet “Mowing,” and the tool is the scythe. A technology that greatly improves human capability but which still requires skill and patience to use effectively – a tool which forces a human to be there in the moment. In discussing the scythe Carr consistently refers to it as “a congenial tool” (218) – and it may be a coincidence, but this seems more than slightly reminiscent of what Ivan Illich referred to as “convivial tools.” Indeed, the conclusion that Carr comes to regarding the benefits of small-scale and simpler technologies is one with a lengthy pedigree – from William Morris to Peter Kropotkin and from Ivan Illich to E.F. Schumacher. Indeed, there is quite a bit in Carr’s conclusion that gestures towards the problem that Mumford was always wrestling with: the opposition between the “goods life” and the “good life.”
A risk, of which Carr seems quite cognizant, is that such reflections on a more idyllic relationship to technology may come off as sentimentality or silly romanticism. Yet the point, subtle though it may be, is that the scythe and the robotic mower are more than simply technological artifacts – they are ethical statements on life and on the broader society in which these technologies are used. What Carr calls for in his conclusion is that the reader shatter the propaganda skillfully disseminated by the advocates of automation in order to truly consider the technologies around them. To look beyond the promises of “a life made easier” and to instead consider whether or not that life is one filled with meaning or just filled with idle technological diversions. A glass cage, as it turns out, can be quite comfortable – but we should have the moral fortitude and intellectual bravery to recognize that it is not a room with windows, but a cage. Granted – and here we return to Mumford and Ellul’s critique – it is perilous to think that one can escape from the cage whilst leaving the rest of society still imprisoned, for this cage threatens to grow and encompass the entire globe. Yet, as Carr writes (with a nod towards Star Trek):
“Resistance is never futile…our highest obligation is to resist any force, whether institutional or commercial or technological, that would enfeeble or enervate the soul.” (232)
The Glass Cage is a readable, illuminating and ethically rich book – one that draws its value not simply from examples of the woes and would-be-wonders of automation but from demanding that readers think about these topics not as boring technical matters but as vital issues shaping the way we live and who we are. Aimed at a broad audience The Glass Cage offers a nice introductory foray into important issues that hopefully will convince readers to explore these ethical matters in more detail (Albert Borgmann’s Technology and the Character of Contemporary Life being an excellent next step). At its core The Glass Cage is not really a book about automation, it is a book about what it means to be human, and what happens when we turn our ethics on autopilot and trust that somebody else will design an app that will provide us with the good life.
Yet, as Nicholas Carr makes clear, we are being confronted with a choice between the scythe, skillfully handled, and the reaper drone. While, when it comes to technology, we may not truly have the power to pick one and reject the other, one choice that remains squarely with each and every individual is to ask ourselves what it means to be human in these technological times. To ask whether our tools are helping or hindering this pursuit. To recognize in the end, as Charlie Chaplin claimed:
“more than machinery, we need humanity.”
The Book Reviewed:
The Glass Cage: Automation and Us by Nicholas Carr
W.W. Norton & Company, 2014.