"More than machinery, we need humanity."
When engaging in a contemporary conversation about human rights it is inevitable that the matter of technology will be brought up. The interesting thing is not that this issue arises, but rather the way in which the technological aspect is treated in the discussion. The world today has undergone many overt and subtle reconfigurations by technological systems (see: the Internet) and by those who build, advocate for, and profit from those systems. It should not be controversial to claim that issues surrounding technology are shifting the discourse around questions such as human rights. Yet such certainty only serves to raise a potentially more controversial matter: when discussing technology and human rights, to what extent does one need to at least entertain the idea that technology may not be the doctor but the disease?
Or, to put it in slightly less dire, but no less serious, terms, courtesy of Lewis Mumford (Technics and Civilization):
“Using the machine alone to escape from the machine, our mechanized populations have jumped from a hot frying pan into a hotter fire. The shock-absorbers are of the same order as the environment itself.” (Mumford, 316)
The argument in favor of technology as a shock absorber is on full display at this year’s RightsCon, being held from March 3-5 (2014) in San Francisco. The conference brings together a range of people from the private sector and the activist/civil society/non-profit sector to discuss how technology can help those engaged in human rights work be more successful in advancing their causes. While the conference is not particularly large, numerically speaking, the list of speakers and sponsors (which includes Google, Facebook, and Twitter along with various NGOs) indicates that this is nevertheless a site where people are seriously engaging with the issues of technology and human rights. Conferences such as RightsCon, and the discussions that arise there, are important at historical moments such as this. Yet there is a certain degree of irony to a conference on technology and human rights, sponsored by major Silicon Valley companies, occurring in San Francisco at a moment when many of that city’s residents seem to have stopped buying the “good news” preached by the tech industry.
When thinking about human rights and the world today it is easy to view technology from a somewhat disinterested position that cloaks the technical apparatus in a façade of neutrality. The result is that the discussion of technology and human rights becomes less about “technology and human rights” and shifts toward more comfortable technical territory such as “net neutrality” or “the open Internet” or “the web we want” or “cheaper devices.” The problem is not these shifts in and of themselves; the problem is that if one confronts technological questions looking only for technological answers, one risks ignoring the degree to which technology (and its advocates) has altered the questions so that the answers (“more technology!”) seem to follow naturally. In the terminology of logic and philosophy this is the literal definition of “begging the question”: the conclusion is implicit in the initial premises (“technology can solve all of our problems, so the way we will solve our problems is with more technology”).
After all, when we discuss “net neutrality” do we not need to recognize that the vast majority of people reach the Internet through closed proprietary software running on machines built by huge multinational companies under (often) questionable labor conditions? Does praising net neutrality not also require assuming an adversarial position towards the increasing consolidation of online control by firms like Google and Facebook, firms that just happen to be sponsors of RightsCon? When we talk about human rights and technology do we not need to consider the materiality of technology, the discarded devices leaching toxic chemicals into landfills in the developing world? The point is not to be purely argumentative, but to emphasize that when thinking about issues such as human rights, the ethically grounded stances of those advocating for rights (like free speech) do not necessarily mesh neatly with the profit-driven ethos of major technology firms. There is a difference between being focused on the relationships among publics and being a company with a public relations department.
Perhaps the clearest example of this problem at RightsCon (based on studying the event’s schedule) has to do with the issue of surveillance. After all, if there has been a defining issue in the world of technology and rights in the last year, it may very well have been privacy and surveillance, the prevalence of which was made (and continues to be made) disturbingly clear thanks to Edward Snowden’s revelations about the NSA. Surveillance is a real issue, and it is one with important international implications. While many activists and journalists in the US may chafe at the very idea of surveillance, it is important to bear in mind the degree to which surveillance and control are currently and actively being used for repressive purposes in many parts of the world. It is understandable that the tech industry would want to engage with these issues, but it is important to remember the extent to which that very industry is culpable and has been implicated in them. Thus when the tech industry calls on the government to reform its surveillance tactics, a justified question should be: if you care so much, why did you wait to push back until your complicity was revealed?
This in turn brings us back to the question of technology, for most of the surveillance being discussed at RightsCon is the variety that has been greatly enhanced by technology. The smartphone is a tool that many see as having fantastic and transformative potential, yet this last year has also provided plenty of evidence that the smartphone can be as efficient as a battalion of secret police officers ordered to track and monitor a human rights activist. Likewise, those who are concerned by the prospect of an unaccountable government compiling massive dossiers on the activities of activists and journalists should probably also be concerned when a technology company (be it Google, Facebook, Yahoo, Amazon, etc.) does the same – especially as what the tech company has gathered may wind up transferred to a governmental entity with aims other than profit. It may be, to return to Mumford, that:
“the shock-absorber prepares one for a fresh shock.” (Mumford, 316)
This issue of “shock-absorber[s]” and “fresh shock[s]” should be central to the discussion about human rights and technology. As technology seeps into ever more aspects of our everyday lives (“the Internet of Things,” “smart homes,” self-driving cars, Google Glass), the onus is on advocates of human rights to recognize, before these items become widely disseminated, that they pose great potential risks. Frankly, that technology companies have continued unveiling such new devices with an attitude of “we’ll fix the surveillance issue in the next update” is not a comedy but a tragedy. This is not to say that some people in the tech world do not care about these issues, nor is it to suggest that they should be barred from offering solutions; however, it is to suggest that they may not be the most honest participants in this discussion. For all Facebook’s talk of “making the world open” and all Google’s talk of “don’t be evil,” do we truly expect either company to risk losing a sizable amount of profit over a human rights issue (even if they could afford it)? Consider the difficulty of getting your information out of one of these systems once it goes in, the length of time these companies hold information, the number of lobbyists they employ to defend their financial interests, and their routine changes to “Terms of Service” agreements such that few people genuinely know what they have agreed to or how it has changed. These companies’ interest in people being “free” and “open” is not necessarily synonymous with human rights groups’ interest in people being “free” and “open.” By the same token, is it not also something of a human rights issue when extremely wealthy multinational corporations dodge taxes?
The challenge for RightsCon, the challenge to RightsCon, and the challenge for those who see technology as intertwined with human rights is to recognize that technology is not independent of our ethical concerns; it is an aspect of our ethical concerns. As Paul Goodman emphasized:
“Whether or not it draws on new scientific research, technology is a branch of moral philosophy, not of science. It aims at prudent goods for the commonweal, to provide efficient means for these goods. At present, however, ‘scientific technology’ occupies a bastard position, in the universities, in funding, and in the public mind. It is half tied to the theoretical sciences and half treated as mere know-how for political and commercial purposes. It has no principles of its own.” (Goodman, 40)
Alas, in the years since Goodman wrote those words (1969) it is easy to argue that technology has seemingly discovered “principles of its own” (or had principles imposed upon it), and those principles have often been the standard capitalist principles of “profit,” followed by “more profit,” followed by “wait, could we have some more profit?” Engaging with issues of human rights, however important, and however sincerely some people at these organizations may care about them, does not overcome the profit motive of these firms. Yet what is on display at RightsCon seems to be a recognition that the “principles” guiding so much of the tech industry today do not necessarily need to be the “principles” of technology. But to consider the possibilities of technology for genuinely advancing human rights requires recognizing the many ways in which the current arrangement of technological power frames these issues in a manner guaranteed not to make matters too problematic for the tech world.
To reiterate: at a technological moment such as the current one, events like RightsCon are important forums, but having a real debate around technology and human rights requires asking some of the uncomfortable and unpopular questions about technology – such as the blasphemous one: do we really need more machines? It also requires a willingness to recognize that the technology that emerges in a society is not an aberration of that society but a reflection of it; unequal societies rife with socio-economic disparity and consumer-driven capitalism will produce technologies that reflect those values.
Every technology represents a political and ethical framework, and these tend to be the frameworks developed in, and advanced through, these devices by the firms that make and market them. Recent entrants like the Fairphone and the Blackphone (without advocating for either) both demonstrate ways that political and ethical issues can be directly addressed in technical design. It is not that we need to talk about how technology can advance human rights values; it is that we need to see that technology is a human rights value. For technology reconfigures our world, alters our socio-political and ethical relation to it, and changes the debate about human rights. From the moment the metals are mined, through the programming and the assembly, through the item’s use, to its eventual disposal, this is a technological chain shot through with human moments, and therefore with issues relevant to human rights. When considering human rights and technology, the rights of the miner or recycler are every bit as important as the rights of the technology user; indeed these issues are inextricably linked, and the discourse about human rights and technology needs to make this clear.
For a conversation about technology and human rights to be more than a publicity stunt for techno-utopians (“tech-washing”?) and their employers, the issue that needs to be foregrounded is not technology but the ethical implications of technology. If you start with the assumption that “technology will cure the disease,” then you may overlook the fact that technology is causing the disease to mutate and spread faster than the doctors can keep up with it. The challenge for RightsCon, and for those who genuinely care about questions of human rights and technology, is to remember, to return to Goodman, that:
“without moral philosophy, people have nothing but sentiments.” (Goodman, 44)
RightsCon may have the right sentiment, but without moral philosophy it is just a tech conference in a city that is steadily growing disenchanted with the tech industry. Granted, addressing these issues might be very expensive and uncomfortable for tech companies, and it’s much less expensive to help sponsor a conference.
Goodman, Paul. New Reformation. PM Press, 2010.
Mumford, Lewis. Technics and Civilization. University of Chicago Press, 2010.