Have you ever felt rather depressed after using social media? Or quite happy? It is probably no great stretch to imagine that the answer to at least one of those questions is “yes.”
Now, what if you realized that your emotional reaction was not a normal response to unfiltered content, but instead reflected manipulation on the part of the website? What if your mood was the result of those behind the site trying to see whether they could swing it? If the social network you are thinking of is Facebook, then there is a chance that some of what you were feeling (at least in January 2012) was the result of a psychological experiment – one that you agreed to participate in by hitting “I agree” on Facebook’s terms of service.
The study consisted of Facebook (and the researchers it was working with, from Cornell and UCSF) toying with a percentage of users’ News Feeds – blocking out either words with positive connotations or words with negative connotations. The not altogether stunning result was proof – backed by a psychological study – that the emotions people are exposed to through Facebook have an impact on their own emotional states. As the study put it (specific numbers removed):
“When positive posts were reduced in the News Feed, the percentage of positive words in people’s status updates decreased…compared with control…whereas the percentage of words that were negative increased…Conversely, when negative posts were reduced, the percent of words that were negative decreased…and the percentage of words that were positive, conversely, increased.”
In other words: the emotional states that people detected from their friends on Facebook wound up influencing the way they expressed their own emotional states on Facebook (which, arguably, would go on to influence still other friends). Facebook is committed to giving people “the power to share and make the world more open and connected” – and while this may include a proudly touted (if self-serving) commitment to opposing government surveillance, the company is evidently not above manipulating the emotions of its users.
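To make the mechanics concrete, here is a minimal sketch of the kind of sentiment-based filtering the study describes. Everything in it is a hypothetical stand-in – the word lists, the post strings, and the filter_feed function are invented for illustration (the researchers used word-counting software within Facebook’s actual News Feed system, not this toy logic):

```python
import random

# Tiny, hypothetical stand-ins for the positive/negative word lists;
# the real lists used in such studies contain thousands of entries.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "lonely"}

def post_sentiment(text):
    """Classify a post by whether it contains any positive or negative words."""
    words = set(text.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress, omission_rate=0.5):
    """Probabilistically omit posts of the targeted sentiment – the basic
    shape of the experimental manipulation."""
    return [
        post for post in posts
        if not (post_sentiment(post) == suppress
                and random.random() < omission_rate)
    ]

feed = [
    "Feeling wonderful about the weekend!",
    "This week has been awful and lonely.",
    "Grabbing coffee downtown.",
]
print(filter_feed(feed, suppress="negative"))
```

Run repeatedly, the negative post above simply never appears about half the time – and, per the study, the user’s own posts drift accordingly.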
At the risk of minimizing the impact of this story, it may be useful to boil it down to one simple term: creepy.
While this study certainly raises a host of ethical considerations – is agreeing to Terms of Service (ToS) really the same as giving informed consent to participate in a psychological study? – the main impact is felt not in the lingo of scientific experimentation but in the basic recognition that Facebook was actively manipulating the emotions of its users. And frankly, that is rather creepy. It is also rather worrisome. The study may have officially ended, but if Facebook feels that its ToS give it permission for such manipulation, it becomes rather difficult to put much trust in the platform – retroactively or going forward. Despite evoking upbeat words like “share…open…connected,” Facebook is also quite interested in getting you to click on advertisements – are happy or sad users more likely to click on ads? What Facebook – and its researchers – seem to have recognized is something Bertrand Russell put simply many decades ago:
“Machines have altered our way of life, but not our instincts.” (69)
Thus Facebook recognized that these machines could be used to manipulate those instincts – with rather predictable results.
On the surface this story appears to be about a powerful social network manipulating its users in potentially unethical ways. While such a reading is not incorrect, its emphasis on the emotional rigging and ethical failure may distract from less simple but all the more important issues. For this is not simply about manipulation – it is about a massive technology firm recognizing that, with some tweaking of algorithms and programs, it can elicit desirable responses from users. Facebook has well over a billion users, many of whom (as the study demonstrates) use the social network to give voice to their emotional states – for a proprietary corporation to have the power to determine which of those states gets seen, and by whom, should give one and all pause. It is a stunning recognition of the amount of power these tech companies have, and even more jarring proof that they are curious about how to use this power to advance their own ends. From the massive user pool, to the flow of information, to the murky terms of service – this story is inconceivable without modern technology, and should not be considered in isolation from a larger delving into technological matters.
Though Facebook is certainly the one being upbraided at the moment, one should not too blithely place the blame solely upon it – it does not require much of a stretch of the imagination to picture other major tech platforms behaving in a similar fashion. The sentiment that power gives a company permission to shrug off potential protest or ethical concerns is becoming the modern tech firm’s hubristic signature. The space between Facebook’s emotion study and Google Glass is smaller than it may seem, as both are expressions of corporations deploying proprietary tools simply because they can. Whether it is Amazon punishing publishers, YouTube demanding that musicians fall in line, or Facebook manipulating users’ emotions – technology firms are proving that the tools for dissemination are also the tools for discrimination and control.
The dispersion of modern technologies has allowed platforms and firms an astonishing level of access. What makes the Facebook study so galling is not just that it took place (with a reasonably small group actually being experimented upon), but the thought of how many people it could have impacted – and the fact that participants did not know. The group in the experiment consisted of 689,000 Facebook users (a sliver when one considers Facebook’s billion-plus users) – but it could easily have involved far more, and they would not have been informed either. Indeed, it is difficult to truly trust the number 689,000, as it fails to take into account those users’ friends, who in turn may have been emotionally influenced by an influx of positive or negative content. And while 689,000 may be small in comparison to Facebook’s total user base, it is still a huge number of people – people delivered to Facebook’s experiment courtesy of contemporary technology. Furthermore, who is to say that your Facebook profile is not at this very moment sitting in the waiting room for the next experiment?
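The point about uncounted friends can be made concrete with a toy calculation. The graph below is entirely invented; it simply shows how the set of people plausibly exposed to a manipulated feed exceeds the set of people officially in the experiment once a single hop of friendship is counted:

```python
# A toy friendship graph – every name and connection here is invented.
# A subject's (possibly mood-shifted) posts appear in their friends'
# feeds, so the manipulation's reach extends at least one hop out.
friends = {
    "alice": {"bob", "carol", "dave"},
    "bob": {"alice", "erin"},
    "carol": {"alice", "frank", "grace"},
}

subjects = {"alice", "bob", "carol"}  # users whose feeds were altered

exposed = set(subjects)
for subject in subjects:
    exposed |= friends.get(subject, set())

print(f"official subjects: {len(subjects)}")  # 3
print(f"plausibly exposed: {len(exposed)}")   # 7
```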
Yet it is worth reiterating that Facebook’s emotional control experiment is not simply about Facebook; rather, it is indicative of the controlling influence that technological platforms exert in our daily lives. People who live in technological societies and use Internet-connected devices throughout the day have become accustomed to mediating their experiences of the world through technology: we get directions online, we shop online, we send pictures, we read the news, we read e-books, we tweet, we watch YouTube videos, and we use social networking sites like Facebook. Using these devices, sites, and platforms may give people a sense of new freedoms and access to a staggering quantity of information – but the Facebook study reveals the negative side that lurks just below the friendly veneer. The devices that we think we control have the ability to control us back – they track our movements, read our e-mails, know what sites we visit, and (as we now know) may be used to manipulate our emotions.
While “I am going to quit Facebook” is certainly a legitimate response to the experiment, it needs to be twinned with a recognition of the amount of power held by major tech firms. It is not simply that these companies have mountains of money; it is that they have the ability to exert a great deal of influence – and those falling under that influence may not even realize it.
The Facebook emotion experiment is the funhouse-mirror version of the NSA’s mass surveillance program – both entail harnessing pervasive and invasive technology for the maintenance of social control. Or, to put it another way: the Facebook emotion experiment is Huxley to the NSA’s Orwell – it is less that the regimes of control imagined by those authors are opposites than that they are systems that can work to strengthen each other. In his book Amusing Ourselves to Death the media theorist Neil Postman contemplated this matter, which he called “The Huxleyan Warning,” at length. Postman recognized that Orwell’s world of Big Brother sparked easier recognition and outrage than the world of the feelies, soma, and bumblepuppy, wryly noting that people are unlikely to rise up against that which keeps them amused (or “connected”). Technological society has been wondrously successful in providing people with a myriad assortment of “goods” (of which Facebook is certainly an example) while draining society of contemplation of “the good.” Facebook’s manipulation of users’ emotions bears no resemblance to “the good”; it is only a cunning manifestation of the use of “the goods” to influence people into believing that they are experiencing “the good.” A social network may provide people with a space to share their feelings – but if a for-profit entity is making sure that only the “correct” feelings get through, then this is a farcical notion of “open” exchange.
A range of emotional responses to the Facebook experiment is justified, but surprise should not be amongst them. Indeed, the Facebook study is simply an affirmation of what Postman wrote decades ago:
“it is much later in the game now, and ignorance of the score is inexcusable. To be unaware that technology comes equipped with a program for social change, to maintain that technology is neutral, to make the assumption that technology is always a friend to culture is, at this late hour, stupidity plain and simple. Moreover, we have seen enough by now to know that technological changes in our modes of communication are even more ideology-laden than changes in our modes of transportation.” (157)
When we sit before the digital trough of Facebook’s News Feed are we choosing what we eat or are we simply being fed? Is the food on offer a balanced healthy meal or just a sugary slurry carefully concocted to keep us clicking? Do we really understand the information on the label?
That the Facebook experiment sticks in the back of our throats is just further proof that we need to rethink our technological diet.
Works Cited:
Postman, Neil. Amusing Ourselves to Death. Penguin, 2005.
Russell, Bertrand. Sceptical Essays. Routledge Classics, 2004.
Related Content:
Riddled With Questions – Interrogating Your Technology
Luddism for these Ludicrous Times
A Pyramid of Technological Control
Whose Vision of the Future is This?
Very interesting take. I have several thoughts and reactions in no particular order. 1) There appears to have been a universal hunger on the part of those in power and those in academia to wonder “what are they thinking?” about the Great Unwashed (us). The rise of statistics as an alleged science underscores this, despite the uncontradicted dictum of Disraeli: there are lies, damned lies, and statistics.
2) If this “study” is to masquerade as scientific, where are the results on the control group? Bet a dollar there WAS no control group.
3) Anyone who believes these activities are not transmitted to the government spooks is so naive it will be hard to have a cogent discussion.
4) That term they are bruiting about seems a new buzzword for mob-think. Was a study really needed to demonstrate that bad news is depressing and good news is not? That scare tactics in the media work, and scare people? Whether the information conveyed has any remote resemblance to truth?
5) It was a bit more difficult to manipulate thoughts and emotions on a wide band before the days of “social media” – an oxymoron for the 21st century. But yellow journalists managed; e.g., Hearst driving this nation into war with Spain over Cuba after the alleged blowing up of an American warship, and LBJ’s infamous Gulf of Tonkin deception.
6) The principal finding of this study should be headlined by the most traditional term associated with computers: GIGO – garbage in, garbage out. Input garbage to a person’s “feed” and expect garbage output from that person’s feed.
7) The undiscussed absence of an “unlike” button makes any results from a FB survey so farcical as to be hard to describe. You read some post that popped up in your feed and don’t like it – if a “don’t like” button existed, you could condemn it to don’t-like purgatory and move on, minimally intruded upon. Remember the good old days of newspapers bringing our news? You could fold the front-page awfuls back and go straight to the comics or the sports: the equal of a “don’t like” button. My grandfather always shouted “knob” when something came on the radio he didn’t like: turn the knob, change the station. Maybe we should all just type KNOB when we get a post we don’t like or didn’t want in our feed. The TV remote was designed for people like my grandfather.
8) Such a manipulation was easy to intuit even before the news broke. I kept getting crap in my feed from sources never visited and certainly not liked; the stream of advertisements along the side of the page told me I was being tracked – they even tried to sell me my own books there, and boots, and Big and Tall clothing, and Zoosk hotties, and hunting and fishing gear. So when I started getting repetitive posts on that weeping nutjob with his arm thrust in the air saying he wanted to disarm America because another nutjob killed his child – funded by the Bloomberg billions and his straw groups – and when I wrote in comments asking to keep this damn thing out of my face – and they kept right on sending it – it was clear they were recording results and trying to persuade the reader there was a huge groundswell of these people out there. The ridiculous statistics quoted to support terror of citizens were very reminiscent of Hearst’s yellowest days.
9) An unspoken but obvious corollary to this study was to run a packet sniffer on guys like me who refused to be moved in the direction of their Judas-goat posts, for resale to the government. Because I have FB “friends” across the political spectrum from rampant lefties to grumpy conservatives, they probably wanted to sort us into a special category – sheep from the goats, in the words of a 1950s SF story predicting this use of worldwide computing.
10) The only technology neo-Ludds should be embracing is a portable EMP rifle.
(Bet that one will get me into the queue!)
Thanks for this post. I have now wasted more time on FB today!
Reblogged this on My Little Spacebook and commented:
Facebook’s exploits never fail to creep me out…
Ugh, no one’s surprised, but we all keep using it. Unless a critical mass of my friends and contacts moves on to something else, it remains my primary way of keeping in touch as an expat.