"More than machinery, we need humanity."
Attempting to understand trends can easily give one an unpleasant headache. From fashion to music and from food to news articles – the reasons why a particular thing is trendy at any particular moment can be stubbornly opaque. Yet, at least when it comes to topics that are presented as “trending” on social media, the reasoning seems clearer. Such trending material is not a matter of the obscure and mysterious behaviors of mythical “taste-makers” but a reflection of the logical operation of a seemingly objective algorithm. Right! Right? Well…evidently not.
Facebook has recently become rather red in the face as a result of an exposé, of sorts, which showed that there are real humans (not an algorithm) responsible for picking what appears in its “trending” section. Granted, the real controversy was not so much the fact that humans were involved as it was an anonymous source’s revelation that members of the news curating team may allow their personal politics to influence what makes it into Facebook’s trending box – the accusation being that Facebook, by way of its curators, has been suppressing trending topics of a conservative bent. Obviously the story was not something Facebook could simply shrug off and thus, amidst threats of a Senate inquiry, Zuckerberg himself agreed to meet with various prominent conservatives to allay their concerns. Still others have countered that the problems on the news curating team are not about political bias, but about the fact that it is a “toxic work experience” defined by sexism and exploitative labor practices. Yet, regardless of what exactly is transpiring behind the scenes at Facebook, these stories provide further confirmation of Erich Fromm’s observation that:
“There is also no strength in use and manipulation of objects; what we use is not ours simply because we use it.”
Just because a person is using Facebook does not mean that Facebook is theirs in any meaningful way. Indeed, Fromm’s wry comment is a useful retort to keep in mind when thinking about the myriad devices and services that fill technological societies. The degree of this “not ours” often only becomes clear when there is some sort of breakdown that reveals the proverbial person lurking behind the curtain. However, such rare moments of revealed reality are swiftly either forgotten or patched over with assurances that “it’ll never happen again” (hence Zuckerberg’s mea culpa) – and thus users are encouraged to return to a state in which they look upon services like Facebook as basically benevolent. Therefore, inasmuch as these moments have value, it lies in what these disruptions make visible: not simply how things actually operate, but the gulf between how things actually operate and the ways in which many people think they operate. It is an opportunity to reflect upon the questions that pop up in these moments.
Thus, there are several important elements that should be considered, including: Facebook’s role as curator, whatever it is that Facebook truly is, who is responsible for Facebook, and the faith in the objectivity of algorithms. These shall be considered in turn – though it is worth noting that there are many issues beyond these which are worthy of rumination. Nevertheless, before advancing further, it is worth avoiding the quicksand of hyperbole when thinking about this affair. After all, Facebook was not banning the content of conservative publications from its site, nor was it banning people from discussing such content – Facebook was simply keeping such discussions from appearing in its “trending” section. And though it is always worth viewing the claims of tech companies with skepticism, especially when they are trying to save face, it is worth bearing in mind that the Trending Topics section of a person’s Facebook feed is just one component in a much larger system. That a given topic isn’t appearing in the Trending box does not mean that a person won’t find plenty about it appearing on their wall – and it doesn’t mean that person will be prevented from posting content related to it.
But, without further ado, into the breach…
Amidst what is turning out to be a rather contentious electoral season (to put it mildly), the news that Facebook was, according to one anonymous source, keeping conservative topics from trending is certain to have outraged some even while others chortled in a moment of schadenfreude. Yet the key takeaway here, for those from all over the political spectrum, should be the recognition that if this could be done to one political ideology…it can be done to a different one. While the article in Gizmodo implied that the suppressing of certain views is related to the biases of the curators, it is not too grand a logical jump to believe that different curators will simply bring different biases. The point is not to argue for some mythical “perfectly objective arbiter,” and it certainly is not to argue that what is needed is some kind of magically neutral algorithm (more on that later) – but to recognize that there are people involved, and people have biases. Even appeals to authority couched in directives that involve checking news sources like The New York Times or Fox News are still problematic – as such news sources are similarly not free from bias. Indeed, insofar as Facebook may claim that the guidelines it gives its curators do not allow them “to add or suppress political perspectives,” what this largely amounts to is a vote in favor of the status quo – which is still a political perspective. The list of media outlets Facebook tells its curators to consult may not feature many “really right wing” sources…but it is also missing any “really left wing” sources. The list is predictably banal. And though the focus is on Trending Topics in the US, it may be worth considering the degree to which Facebook would be willing to happily suppress particular perspectives whilst operating in a country with a more censorious government.
Yet, once more, it is worth taking a moment to pause if only to recognize a simple fact: Trending Topics are personalized. The topics that you see are going to be different from the topics that your great uncle will see, and they are also going to be different from the topics that your friend with different taste in movies sees – because Facebook populates a given person’s Trending Topics feed based on all of the information about that person that Facebook has already gathered. Therefore, if your Trending Topics isn’t filling up with articles about the wonders of pineapples (to give a purposely silly example) it may be because you’ve “liked” the organization “Pineapples are Terrible,” have listed “allergic to pineapples” in your bio, and have never clicked the “like” button on anything pineapple related. In other words: not seeing conservative news in your Trending Topics feed? Well, it may be because all of the information that Facebook has gathered on you thus far suggests to them (or their algorithms) that you are not conservative – and therefore that you probably aren’t going to be interested in those stories.
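The personalization described above can be sketched as a toy ranking function. Everything here – the signal names, the weights, the pineapple data – is a hypothetical illustration of the general idea; Facebook’s actual ranking system is proprietary and vastly more complex.

```python
# Toy sketch of personalized trending-topic selection.
# All signals, weights, and example data are invented for illustration.

def score_topic(topic, user_profile):
    """Score one candidate topic for one user."""
    score = topic["global_engagement"]  # how much everyone is discussing it
    # Boost topics in categories the user has shown interest in.
    if topic["category"] in user_profile["liked_categories"]:
        score *= 2.0
    # Suppress topics in categories the user has signaled disinterest in.
    if topic["category"] in user_profile["disliked_categories"]:
        score *= 0.1
    return score

def trending_for(user_profile, candidates, k=2):
    """Return the top-k topic names, personalized to this user."""
    ranked = sorted(candidates,
                    key=lambda t: score_topic(t, user_profile),
                    reverse=True)
    return [t["name"] for t in ranked[:k]]

user = {"liked_categories": {"politics"},
        "disliked_categories": {"pineapples"}}
candidates = [
    {"name": "Pineapple Festival", "category": "pineapples",
     "global_engagement": 90},
    {"name": "Senate Hearing", "category": "politics",
     "global_engagement": 40},
    {"name": "Movie Premiere", "category": "entertainment",
     "global_engagement": 60},
]

# Despite having the most global engagement, the pineapple story never
# surfaces for this user -- their own history has filtered it out.
print(trending_for(user, candidates))
```

The point of the sketch is that the most-discussed topic overall can still be invisible to a particular user: the ranking is driven as much by the profile as by the world.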
The problem that emerges here has to do with the question of which topics are made more widely visible. And here it genuinely is worth recognizing that Facebook does have the ability to determine which topics are going to appear and which are not. Indeed, it may be particularly relevant here to look beyond the headline of the Gizmodo article to note that it also suggested that Facebook suppresses stories about Facebook. Consider the following: Facebook launched Trending Topics in early 2014, and in June of that year the site came under fire when it was revealed that it had been conducting emotional experiments on users – do you remember how that story was represented in Trending Topics? Was it represented? Or was that story conveniently kept out of the Trending Topics section, as placing it there might have been embarrassing to the company? Many, many (many!) people use Facebook, and having control over what is presented to these people as Trending places a great deal of power in Facebook’s hands. And even if the company vows (vows!) that it is not manipulating the Trending Topics, it is worth bearing in mind that the company certainly could be. Who knows, maybe the company wants to see what kind of emotional reactions people have when the Trending Topics are flooded entirely with good news or entirely with bad news? Nevertheless, what this affair makes abundantly clear is that whether it does so with curators or algorithms, Facebook has become an important arbiter in determining which topics get placed before its users. Facebook has the ability to direct a ton of traffic to a given story. Which brings us to the next matter to consider: what the heck is Facebook these days?
The obvious answer to this question is that Facebook is a social media platform. Right? Right. But usually the focus has been upon the “social” aspect of that equation, while at the moment “media” is the element coming to the fore. The “social” has been the world of “likes,” “pokes,” vacation photos, and reminders to write “happy birthday” on the wall of that distant acquaintance who you only interact with by way of annual happy birthday messages. We’re accustomed to thinking of Facebook as a “social” company; however, it is increasingly important to think of Facebook as a media company as well. And a darn powerful one at that. Chances are that there are more people on Facebook than watching CNN (to give a random example) – so what Facebook deems “trend worthy” has quite serious implications. The challenge, of course, is that people tend to have different expectations of media companies than of social media companies. People expect the New York Times or Fox News or [Reader! insert the news company of your choosing here] to provide updates on the latest, important news in a manner that at least has a patina of objectivity. True, people recognize that these news companies have to make choices about what they report at a given time, but (wrongly or rightly) there is a general perception that these companies are motivated by an obligation to keep people informed of the important matters of the day.
Does Facebook have any such obligations? Alas, not really. Facebook may want its Trending Topics feature to give the impression that people are using Facebook to chat about serious issues – but Facebook isn’t motivated by a desire to keep people informed. Rather, Facebook is motivated by the desire to keep Facebook users on Facebook. After all, if you’re discussing the news on Facebook then they can mine that discussion for data and sell it to advertisers. And in providing you with a “personalized” set of Trending Topics Facebook is making less of an editorial decision about “this is important!” and more of a gesture in the direction of “based on your past behavior, we think you’ll like this.” Facebook is a social media company that has become tremendously successful by warping complex social relations and turning them into a stream of simplified “likes” – why should it be any surprise that the company would approach the responsibilities of being a media organization in a similar way?
Granted, a problem in discussing Facebook is the fact that one continually winds up saying/writing things like “Facebook does this” or “Facebook is that” – but who is Facebook? Who bears responsibility for it? Is it the prominent individuals affiliated with the company whose names you actually know? Is it the legions of users without whose unpaid labor the platform would be worthless? Is it the company’s board of directors? Is it the curators who work as members of the trending team – who are actually contractors? There isn’t a simple answer to this question. It seems that much of the current hubbub is a result of those curating contractors…but they also seem to have been (more or less) just following directions handed down to them. If these questions were to be asked of a more traditional news organization it might be tempting to think in terms of the chief editors and the organization’s owners – so perhaps there is something to be said for thinking of the prominent individuals whose names you actually know. Individuals like Mark Zuckerberg do not lack their own political ideas – and tech companies have not exactly been sitting out the game of doling out money in political donations. Indeed, under the guise of charitable giving, Zuckerberg has recently put up a heck of a lot of money for the purposes of advancing his own beliefs. The point is not to single Facebook out, but to recognize that there really is a sort of baseline ideology underlying the platform – all that talk about openness and connectivity sounds like schlocky rubbish, but it’s also of a piece with the company’s techno-utopian neoliberal ideology. Or, to put it slightly differently, the ideology of Facebook consists of the idea that Facebook (and the bevy of companies it owns) is the solution to the world’s pressing problems – it’s bringing people together! It’s connecting them!
But the Trending Topics travail shows that for all of its talk about being the solution, Facebook really just perpetuates the problems. Instead of bringing people together and encouraging a rich conversation about important topics, Facebook allows people to stay in their safe ideological bubbles with a set of Trending Topics catered to their individual needs by an algorithm.
Lastly, it is worth recognizing the simple fact that algorithms are not neutral. This bears repeating: an algorithm is not really any more neutral than a human curator. Why not? Because algorithms are created by humans, and those humans (intentionally or not) program their (and their employers’) biases into those algorithms. After all, it’s possible to create an algorithm that is instructed to ignore content related to a particular political ideology. One could easily replicate this whole affair by replacing human curators with algorithms that had been programmed to ignore certain news sources – granted, algorithms rarely come forward to serve as anonymous sources for Gizmodo articles (but I digress). Determining what is and isn’t newsworthy is never a “neutral” task, regardless of whether this determination is being made by a person or a program. Insofar as algorithms appear to be more objective, this is largely a result of the way in which they are blackboxed – people don’t know what is going on inside these algorithms, and if the algorithms are the proprietary property of a large corporation it may be impossible for individuals to find out how they operate.
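How trivially a human curator’s bias can be rewritten as an “objective” algorithm can be shown in a few lines. The blacklist and the stories below are invented for illustration; the sketch mirrors the essay’s hypothetical, not any real Facebook code.

```python
# Toy illustration of bias baked directly into an "objective" algorithm.
# The blacklisted source and the story data are entirely hypothetical.

BLACKLISTED_SOURCES = {"some-outlet.example"}  # the programmer's bias, encoded

def select_trending(stories):
    """Rank stories by share count, silently dropping blacklisted sources."""
    eligible = [s for s in stories if s["source"] not in BLACKLISTED_SOURCES]
    return sorted(eligible, key=lambda s: s["shares"], reverse=True)

stories = [
    {"headline": "Story A", "source": "some-outlet.example", "shares": 9000},
    {"headline": "Story B", "source": "paper.example", "shares": 1200},
]

# Story A has by far the most shares, yet it never appears: the code runs
# deterministically, faithfully reproducing its author's exclusion rule.
print([s["headline"] for s in select_trending(stories)])
```

Nothing about running this as a program makes the exclusion any less a human editorial choice; it merely hides the choice inside a blackbox.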
Even if Facebook (as may happen now) gets rid of its human curators and replaces them entirely with fancy algorithms, this will not mean that the matter has been conclusively solved in an objective way – for these algorithms will simply replicate and reify the various biases that are built into them. And even should these algorithms be created in such a way that allows them to “learn,” the question will remain as to how it is that these algorithms are learning. Indeed, it is quite likely that such algorithms are constructed to toe the line of Facebook’s ideological biases – and while you may personally find Facebook’s ideology non-threatening (or you may have a different reaction) this does not mean that it is neutral. Indeed, one of the oddest moments in this whole kerfuffle about Trending Topics is seeing people express consternation that humans were making some of these decisions, when many thought it was all done by algorithms. But, as has already been mentioned, algorithms could have easily been programmed to give the exact same results – or similar ones. In the end, one of the most interesting things the Gizmodo article revealed is the uninformed faith that many are willing to place in algorithms. The belief that technology is basically neutral is a comforting thing to think in the midst of a technological society, but that does not mean it is true.
Media literacy involves teaching individuals to approach the news and content put forth by traditional forms of media with a critical and thoughtful gaze. And if there is one key thing to learn from Facebook’s Trending Topics tribulations it is that, in the present day, media literacy must also entail social media literacy. A good first lesson to keep in mind in this regard is Fromm’s previously quoted warning to remember that:
“what we use is not ours simply because we use it.”
Fromm, Erich. The Fear of Freedom. Routledge Classics. London: Routledge, 2001. The quoted line appears on page 225.