"More than machinery, we need humanity."
Have you ever found that you felt rather depressed after using social media? What about quite happy? It is probably no great stretch to imagine that the answer to at least one of those questions is “yes.”
Now, what if you realized that your emotional reaction was not a normal response to the unfiltered content you were seeing, but was instead the result of manipulation on the part of the website? What if your emotion was the product of those behind the site trying to see whether they could swing your mood? If the social network you are thinking of is Facebook, then there is a chance that some of what you were feeling (at least in January 2012) was the result of a psychological experiment – one that you agreed to participate in by hitting “I agree” on Facebook’s terms of service.
The study consisted of Facebook (and the researchers it was working with, from Cornell and UCSF) toying with a percentage of users’ News Feeds – blocking out either words with positive connotations or words with negative connotations. The not altogether stunning result was proof – backed by a psychological study – that the emotions people are exposed to through Facebook have an impact on their own emotional states. As it was put in the study (specific numbers removed):
“When positive posts were reduced in the News Feed, the percentage of positive words in people’s status updates decreased…compared with control…whereas the percentage of words that were negative increased…Conversely, when negative posts were reduced, the percent of words that were negative decreased…and the percentage of words that were positive, conversely, increased,”
In other words: the emotional states that people detected from their friends on Facebook wound up influencing the way they expressed their own emotional states on Facebook (which arguably would go on to influence other friends). Facebook is committed to giving people “the power to share and make the world more open and connected” – and while this may include a proudly touted (if self-serving) commitment to opposing government surveillance – the company evidently is not above manipulating the emotions of users.
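The mechanism described above amounts to a per-post filter: posts matching a list of positive (or negative) sentiment words are probabilistically dropped from a user’s feed. The following is a minimal illustrative sketch of that kind of filter – the word lists, function names, and sample posts are invented for demonstration (the actual study relied on the LIWC word lists and Facebook’s internal systems, none of which are reproduced here):

```python
import random

# Toy sentiment word lists for illustration only (the study used LIWC lists).
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def contains_any(post, words):
    """Return True if the post contains any word from the given set."""
    tokens = post.lower().split()
    return any(token.strip(".,!?") in words for token in tokens)

def filter_feed(posts, suppress, omit_probability, rng=random.random):
    """Drop each post containing a suppressed-sentiment word with some probability.

    `suppress` is a word set (positive or negative); a matching post is
    omitted with `omit_probability`, mimicking the study's per-post filtering.
    """
    return [
        post for post in posts
        if not (contains_any(post, suppress) and rng() < omit_probability)
    ]

feed = ["What a wonderful day!", "Feeling sad today.", "Lunch was fine."]
# Suppress negative posts with probability 1.0 for a deterministic demo.
print(filter_feed(feed, NEGATIVE_WORDS, 1.0))
# → ['What a wonderful day!', 'Lunch was fine.']
```

Even this crude sketch makes the essay’s point concrete: a few lines of filtering logic, applied silently at feed-assembly time, are all it takes to tilt the emotional content a user sees.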
At risk of minimizing the impact of this story it may be useful to boil it down to one simple term: creepy.
While this study certainly raises a host of ethical considerations – is agreeing to Terms of Service (ToS) really the same as giving informed consent to participate in a psychological study? – the main impact is not couched in the lingo of scientific experimentation but in the basic recognition that Facebook was actively manipulating the emotions of its users. And frankly, that is rather creepy. It is also rather worrisome. The study may have officially ended, but if Facebook feels that its ToS gives it permission for such manipulation, it becomes rather difficult to put much trust in the platform – retroactively or going forward. Despite evoking upbeat words like “share…open…connected,” Facebook is also quite interested in getting you to click on advertisements – are happy or sad users more likely to click on ads? What Facebook – and its researchers – seem to have recognized is something that Bertrand Russell put simply many decades ago:
“Machines have altered our way of life, but not our instincts.” (69)
Thus Facebook recognized that these machines could be used to manipulate those instincts – with rather predictable results.
On the surface this story appears to be one about a powerful social network manipulating its users in potentially unethical ways. While such a reading is not incorrect, its emphasis on the emotional rigging and ethical failure may serve to distract from less simple but all the more important issues. For this is not simply about manipulation – this is about a massive technology firm recognizing that with some tweaking of algorithms and programs it can elicit desirable responses from users. Facebook has well over a billion users, many of whom (as the study demonstrates) use the social network to give voice to their emotional states – and for a proprietary corporation to have the power to determine which of these states gets seen, and by whom, should give one and all pause. It is a stunning recognition of the amount of power these tech companies have, and even more jarring proof that these companies are curious about how they can use this power to advance their own ends. From the massive user pool, to the flow of information, to the murky terms of service – this story is inconceivable without modern technology, and should not be considered in isolation from a larger delving into technological matters.
Though Facebook is certainly the one being upbraided at the moment, one should not too blithely place the blame solely upon it – it does not require much of a stretch of the imagination to picture other major tech platforms behaving in a similar fashion. The sentiment that power gives a company permission to shrug off potential protest or ethical concerns is becoming the modern tech firm’s hubristic signature. The space between Facebook’s emotion study and Google Glass is smaller than it may seem, as both are expressions of corporations deploying proprietary tools simply because they can. Whether it is Amazon punishing publishers, YouTube demanding musicians fall in line, or Facebook manipulating users’ emotions – technology firms are proving that the tools for dissemination are also the tools for discrimination and control.
The dispersion of modern technologies has allowed these platforms and firms an astonishing level of access. What makes the Facebook study so galling is not just that it took place (with a reasonably small group actually being experimented upon), but the thought of how many people it could have impacted – and the fact that participants did not know. The group in the experiment consisted of 689,000 Facebook users (a sliver when one considers Facebook’s billion-plus users) – but it easily could have involved far more, and they too would not have been informed. Indeed, it is difficult to truly trust the number 689,000, as it fails to take into account those users’ friends, who in turn may have been emotionally influenced by an influx of positive or negative content. And while 689,000 may be small in comparison to Facebook’s total user base, one should not ignore that it is still a huge number of people – people delivered to Facebook’s experiment courtesy of contemporary technology. Furthermore, who is to say that your Facebook profile is not at this very moment sitting in the waiting room for the next experiment?
Yet it is worth reiterating that Facebook’s emotional control experiment is not simply about Facebook; rather, it is indicative of the controlling influence that technological platforms exert in our daily lives. People who live in technological societies and use Internet-connected devices throughout the day have become accustomed to mediating their experiences of the world through technology: we get directions on-line, we shop on-line, we send pictures, we read the news, we read e-books, we tweet, we watch YouTube videos, and we use social networks like Facebook. Using these devices, sites, and platforms may give people a sense of new freedoms and access to a staggering quantity of information – but the Facebook study reveals the negative side that lurks just below the friendly veneer. The devices that we think we control have the ability to control us back – they track our movements, read our e-mails, know what sites we visit, and (as we now know) may be used to manipulate our emotions.
While a response to the Facebook experiment of “I am going to quit Facebook” is certainly a legitimate one, it needs to be twinned with a recognition of the amount of power held by major tech firms. It is not simply that these companies have mountains of money; it is that they have the ability to exert a great deal of influence – and those falling under that influence may not even realize it.
The Facebook emotion experiment represents the funhouse mirror version of the NSA’s mass surveillance program – both entail the harnessing of pervasive and invasive technology for the maintenance of social control. Or, to put it another way: the Facebook emotion experiment is Huxley to the NSA’s Orwell – it is less that the regimes of control imagined by those authors are opposites than that they are systems that can work to strengthen each other. In his book Amusing Ourselves to Death the media theorist Neil Postman contemplated this matter, which he called “The Huxleyan Warning,” at length. Postman recognized that Orwell’s world of Big Brother sparked easier recognition and outrage than the world of the feelies, soma, and bumblepuppy; as he wryly noted, people are unlikely to rise up against that which keeps them amused (or “connected”). Technological society has been wondrously successful in providing people with a myriad assortment of “goods” (of which Facebook is certainly an example) while draining society of contemplation of “the good.” Facebook’s manipulation of users’ emotions bears no resemblance to “the good”; it is only a cunning use of “the goods” to influence people into believing that they are experiencing “the good.” A social network may provide people with a space to share their feelings – but if a for-profit entity is making sure that only the “correct” feelings get through, then this is a farcical notion of “open” exchange.
A range of emotional responses to the Facebook experiment are justified, but surprise should not be amongst them. Indeed, the Facebook study is simply an affirmation of what Postman wrote decades ago:
“it is much later in the game now, and ignorance of the score is inexcusable. To be unaware that technology comes equipped with a program for social change, to maintain that technology is neutral, to make the assumption that technology is always a friend to culture is, at this late hour, stupidity plain and simple. Moreover, we have seen enough by now to know that technological changes in our modes of communication are even more ideology-laden than changes in our modes of transportation.” (157)
When we sit before the digital trough of Facebook’s News Feed are we choosing what we eat or are we simply being fed? Is the food on offer a balanced healthy meal or just a sugary slurry carefully concocted to keep us clicking? Do we really understand the information on the label?
That the Facebook experiment sticks in the back of our throats is just further proof that we need to rethink our technological diet.
Amusing Ourselves to Death. Neil Postman (Penguin, 2005)
Sceptical Essays. Bertrand Russell (Routledge Classics, 2004)