Are We Technologically Literate?

Some of the appeal of new technologies is the ease with which they can be used. Though we may recognize, on some subconscious level, that these devices represent impressive feats of mining, engineering, and assembly, it does not require a great deal of work on the part of a user to make them work. Tablets and smart phones are charming in their simplicity; likewise, many of the apps and platforms with which people have become most familiar feature a similarly low bar to use. Granted, such ease of use was a clear goal of the companies and of the designers and engineers they employ.

In a society awash in high-tech tools, people have become accustomed to – and even appreciative of – the way such devices seamlessly integrate into daily life. Though there are occasional grumbles about a bad wi-fi signal, an app acting oddly, or the way a newly unveiled product makes current ones look like fossils, for most people there is a sense of control over the devices they use. They know how to make them work.

Yet there is a difference between knowing how a device works in a narrow, functional sense and understanding how it works a person, and a society, over. To put it another way, knowing how to use a device does not necessarily mean understanding how that device uses its user in return.

It is this dissonance that bubbles to the surface whenever the news is filled with some “shocking” tale in which one of the major characters is a technological system. The ongoing paranoia around privacy that has made international headlines thanks to Edward Snowden’s revelations is largely a story about the ways in which modern technology massively empowers nation states desiring to “collect it all.” Similarly, the current anger at Facebook for having subjected nearly seven hundred thousand users to a psychological experiment in emotional manipulation is less a tale of research ethics than it is a story about a powerful tech company blithely manipulating the tools at its disposal. These are two particularly glaring examples, but it is easy to expand the list: automation concerns, big data, Google Glass, the emphasis on “quantifiable” data, the “Internet of things,” the “sharing economy,” any time you hear the word “disruption” used…these are stories that feature a common character: technology. And what all of these issues reveal is that for all of our societal love of technology, for all of the confidence with which we download new apps and purchase new devices…

We are far from being technologically literate.

This may seem an odd claim. After all, there is no lack of evidence that people know how to use technology, but what is meant here is that we know how to read what it says on the screen (sometimes) while we do not actually know how to read the technology itself. We may know how to use a smart phone – for example – but do we really understand the information it gathers, the information it sends, what it can legally do after we click “agree” on the terms of service, or why some apps are available and others are not? Furthermore, even that sort of reading focuses too narrowly on the level of the user – when we look at the device do we consider the mines from which the minerals were dug, the assembly plant in which the device was put together, or the e-waste dump to which the device will be sent when we have discarded it? Our technological literacy is sorely lacking – not because we do not understand how to use technology, but because the focus on the use of devices has hidden the larger questions from view.

When we think about the powers of new technology, we rarely consider how these powers may be abused.

That technology is considered an important topic in circles concerned with education should be a fairly non-controversial claim. Every week seems to bring more stories about funding for Science, Technology, Engineering and Math (STEM) programs, or a new initiative in which a major tech firm hosts a party to encourage more people to get involved with programming. While there is certainly an argument to be made that STEM programs are useful and that more people should get involved with programming…anybody with a basic knowledge of the natural sciences will recognize that there is little point emphasizing the STEM if the roots have been neglected to the point of death.

The technological troubles of today are not a result of people lacking STEM training, but of the way that the STEM has ignored the roots. The Facebook experiment is what happens when technological powers are emphasized to the detriment of concerns that tend to fall outside the aegis of STEM. In fairness, the Facebook experiment was an interesting study that confirmed a hypothesis (Facebook can successfully manipulate user emotions) – if one focuses only on the technological side, then this is a success; however, when one considers it from the stance of ethics, one is shocked by the gall and hubris of the researchers and the company who thought they could perform such a study. While it is true that users had clicked “agree,” one cannot help but wonder where the voices at Facebook were who dared to say “isn’t this obviously manipulative, unethical, and more than a little bit creepy?” Yet it is but another reminder that statements like “making the world more open and connected” or “don’t be evil” have more to do with public relations spin than with any deeply held ethical commitments.

The problem of technological literacy is a problem of education and learning. While it is very tempting to lay the blame upon tech firms or the state of schooling, to do so is to shift blame in much the same way as we have shifted responsibility to new technologies. We cannot delegate the duty to think critically; that is a responsibility that remains with each individual – there is no app for it. Rather, as the philosopher Simone Weil put it:

“Wherever human relations are not what they should be, there is generally fault to be found on both sides. But it is always far more to the purpose to consider one’s own faults, so as to put a stop to them, than those of the other party. Besides, the need is far greater on our side—at any rate the immediate need.” (204)

It is rather unlikely that apps, devices, or platforms are going to start putting “tracks your every move and gives it to advertisers,” “subjects you to psychological experiments,” “is busily working to make you unemployed,” or “is like a spy satellite trained on you at every moment” in their advertisements. Likewise, the STEM mania that has overtaken many schools, combined with the disease of quantification known as constant high-stakes testing, makes it rather unlikely that students are going to be taught technological literacy. STEM education often seems to be about turning out more engineers who will work for tech firms, not about preparing people to think critically about the power relations that have placed startling levels of power in the hands of those firms.

Yet, to restate, the challenge that faces us is not one of knowing how to think with technology but one of knowing how to think about technology.

Many of the stories we have been taught, and which we have been encouraged to repeat, about technology fit neatly within a certain notion of human progress that paints every new technological advance as a further step towards some technological utopia. Yet it does not take too great a stretch of the imagination to view some recent technological shifts and worry that they are more akin to Brave New World, 1984, or Terminator than to Utopia or to somewhat utopian futures such as the one depicted in Star Trek.

What is generally put in the foreground when it comes to new technologies is the promise of technology – it will give you new capabilities, it will eliminate scarcity, it will free you from drudgery – but at this point it has become rather dangerous to unthinkingly believe this promise. The technologies we see before us – fantastical though they may be – emphasize their positive aspects while burying the elements that would make a person pause and contemplate whether the trade-off is really worthwhile. Furthermore, it is a trade-off that functions by unequally distributing costs and benefits: the smart phone user rarely mined the coltan inside the phone, and they will rarely live next to the e-waste. The façade functions on many layers: pages and pages of terms of service agreements written in dense legal terminology are not meant to be read – that is why you can simply check “I’ve read it” and then click “agree.” And even if a person had read it, there is no guarantee that they would have been able to spin out the logical conclusions – such as the agreement being equivalent to giving informed consent to participate in a psychological study – especially as these terms of service agreements are revised with great frequency and subtlety.

The technologies we encounter are not neutral things that popped up as the result of a natural biological process, nor are they a happy bequest bestowed upon us by benevolent benefactors – they are the result of a clear set of ideological biases and decisions that have guided these devices from the drawing board to your bedside table. And though these tools may allow us to feel an increased sense of power and control, it should be glaringly obvious – from the NSA to Facebook – that the power these devices give an individual is microscopic in comparison to the power bestowed upon the groups behind these devices. All the while one must also keep in mind the many human beings involved in the creation of these devices who never get to play with the latest app or enjoy a massage at a tech company campus: the miners, the assembly plant workers, the e-waste recyclers.

When we look at technology we must learn to see all of these layers of interlocking issues amongst which our usage is just one small part.

It is not enough for us to know how technology works; we must know what types of work it has required around the world, and how it works us and our society over. This is not to argue for a wholesale rejection of technology, far from it, but it is to argue that we need to establish a critical relationship with technology, as it (and those driving it) seems intent on a wholesale rejection of any ideas and values that do not correspond to technological logic. The power held by technology today is not something that has “suddenly happened,” which is why a “shocked” response always seems vaguely comical – thinkers have been warning against the dangers of unbridled technological control for a long time. Though the following words were written in 1952, there is an unnerving parallel between Lewis Mumford’s prediction and our current predicament:

“We have lost the essential capacity of self-governing persons—the freedom to make decisions, to say Yes or No in terms of our own purposes—so that, though we have vastly augmented our powers, through the high development of technics, we have not developed the capacity to control those powers in any proportionate degree. As a result, our very remedies are only further symptoms of the disease itself. Our technics has become compulsive and tyrannical, since it is not treated as a subordinate instrument of life.” (136-137)

The challenge that Mumford set out over fifty years ago has only become more dire today. The technological “powers” we enjoy now are even more “vastly augmented” than those of which Mumford wrote, and yet still we “have not developed the capacity to control those powers.” Alas, we may be increasingly subjected to control by “those powers.” A key first step towards reestablishing autonomy and control is to redevelop a critical distance from the technology that has infiltrated every area of our lives, and to learn how to read the devices that surround us – not in terms of what appears on the screen but in terms of the meaning hidden behind the screen.

Knowing what information an app collects is as important as knowing which buttons to push. Knowing where the minerals in a device came from (and under what conditions they were mined) is as important as knowing where the off switch is. Knowing what you have agreed to in the terms of service is as important as knowing how to share pictures with friends.

We need to become technologically literate.

All of us.

Not so that we can learn how to write code, but so that we can learn to decode the systems that define our lives.

We will be shocked by what we learn.

Works Cited

Mumford, Lewis. Art and Technics. Columbia University Press, 2000.

Weil, Simone. The Need for Roots. Routledge Classics, 2002.

Further Reading

Facebook Gets Emotional

Riddled With Questions – Interrogating Your Technology

A Pyramid of Technological Control

Whose Vision of the Future is This?

Luddism for these Ludicrous Times

“We Still Carry on Thinking”


37 comments on “Are We Technologically Literate?”

  1. EsBee
    July 6, 2014

    I really enjoyed this article. Do you recommend any resources to become technologically literate? I have been doing some research myself, but would appreciate your insight.

    • TheLuddbrarian
      July 9, 2014

      Greetings,

      An excellent – if tough – question!

      As I do not want to burden you with a massive list I’ll just offer two suggestions:

      Technopoly by Neil Postman. An excellent overview of some of the challenges that technology presents, this book does an excellent job of raising insightful questions.

      Digital Rubbish by Jennifer Gabrys. This is a superb book that delves into the material reality of technology (mining, e-waste, etc…). Plus, the book is available free through the publisher’s website.

      – The Luddbrarian

      • EsBee
        July 9, 2014

        Thank you for your response! I will check these resources out! I enjoy your blog!

  2. Ayush Kumar
    July 7, 2014

    Excellent! I also wrote an article today at The Kaleidoscope on the Facebook psychology experiment, which was motivated more by a business mind than by an academic mind. Here is a link to the article: http://wp.me/p4Czjd-31
