
Technophobia and Technomania

‘Everything that is set up can be broken down.’ (Bruno Latour, 2012)

Arie Altena

This text was written for the project Technomania: Opening Up Technologies in 2012. A Dutch translation was published in Gonzo (Circus), if I'm not mistaken.

Technomania, getting lost in a technological high. Technophobia, being afraid technology will take over one’s life. Our lives are ever more intimately connected to technologies – but are we actually taking care of them?

About halfway through his most recent book An Inquiry into Modes of Existence, the French philosopher and sociologist Bruno Latour, one of the most astute thinkers of our technological times, states: “When we talk about a ‘technological infrastructure,’ we are always designating a more or less patched-together mix of arrangements from more or less everywhere that others seek to render irreversible by protecting it from analysis, making it a carefully sealed and concealed black box.” (Latour, 2013, pp. 213-214). This particular description might be a bit abstract, but it works quite well as a definition of technology. Technology is always made up. Technology is always an arrangement of many things: political interests, cultural issues, economy, human behavior. Technology is never made of technological things only. There is no such thing as pure technology. Technology is an assemblage of different interests, both human and non-human.

Take your mobile telephone. Try to describe the different relations on which it depends (the electricity grid to charge the battery, the different layers of communication protocols that make it function), open it, and trace the origin of the parts, the materials of which they are made, the precious metals; think about the work that went into designing the chip. It’s not just ‘technology’ – it’s economy, culture, politics, mining, industry, war, years of scientific progress (and bickering between scientists), and much more. All of this is nicely packed inside a little tool to call people, send texts, and take photos.

When a technology becomes successful, it becomes – at least to some extent – a ‘black box’. A shiny gadget that’s easy to use and functions well, because the internal complexity is hidden. All of the work that went into it and all of the different interests that it assembles are ‘forgotten’. The patched-up nature of the technology is concealed. We forget that it could have been made differently, and that it can be changed. That is why a technology can seem to be a force outside of us, an autonomous thing that threatens us, or puts a spell on us. A thing to fall in love with, or to be afraid of. A case for technomania – as an extreme form of technophilia – and technophobia.

The technomaniac is enraptured with a technology. There is a superficial, consumerist form of technomania: being seduced by shiny screens and loving gestures of intimate interaction, attention glued to the iPhone, direct interface to social life. But the real technomaniac is the geek. The geek knows that technology can be tweaked, pushed to the limit, and changed, but he or she is also seduced, and forgets about the entanglement of technology with other aspects of life. In the most extreme form the geek loses all sense of time and space in a black hole of continuous programming.

A technomaniac usually believes technology is ‘pure technique’ and that because a technology is technically superior, it should prevail. Like the ‘guys’ of the Pirate Bay who assume that because something can be done by technological means (large-scale ‘free’ sharing of movies and music), society should follow in its footsteps. They think technology is ultimately free of attachments to politics, law, economy, culture, human behavior. (For this attitude, see the documentary TPB AFK: The Pirate Bay Away from Keyboard, 2013.) A technomaniac uses a technology as if it exists in a vacuum. In the sociological techno-mystery Aramis, or the Love of Technology (1993) Latour shows how an innovative concept for a metro (‘Aramis’) fails, not because the technology itself is flawed, not because of unwilling politicians, but because the different actors fail to negotiate and adapt the project to changing situations. A technomaniac would blame the politicians for not understanding the technology.

A person who suffers from technophobia has the impression that technology has become too complex to manage. He feels he can’t keep up. Technological innovation seems unstoppable and inescapable. An autonomous process that keeps evolving for its own sake, and threatens to completely change the familiar world. An outside force that is going to destroy all precious things. (“The internet, this terrible monster, destroys everything of value – attention, real friendship, privacy, our printed magazines.”) Technophobia can indicate an awareness of how technologies are changing the attachments in society. (How the internet, smartphones, ‘social media’, and location tracking indeed change the use of time and the sense of space, and reconfigure social behavior.)

Technomania and technophobia are extreme positions. Neither is by definition completely bad or misguided. Technophobia can be justified in some respects. Technomania can have a function – at least through technomania people learn to use a technology. But both suffer from blindness. Both take technology as an autonomous, outside force. For the technomaniac this is a force to be immersed in; the technophobic person feels threatened by it.

It sure can feel as if technological arrangements are invisibly ‘forced’ onto us. Over the past twenty years many people embraced technologies without being aware that this implied a fundamental transformation of society (and culture, economy, politics). To many it might feel as if we have often been kept in the dark about these implications on purpose. The truth is: not many people were really interested in those implications. They were too eager to start using the tools, and forgot about the rest. There is definitely a deficit in knowledge about the technologies we use. Technologies have become magical devices that somehow function, and in many consumer technologies agency is increasingly taken away from the user, to become intelligently black-boxed in a ‘smart’ closed app.

Smartphones, iPads, iPhones: they are made for consumers, and they turn us into unthinking consumers. Take a photo, upload it automatically to Instagram, share it with friends. Push one button. Which is fine. The problem with the consumerist attitude is that the technologies that shape our lives are not really seen as tools to create. (You can take a photo, which usually looks pretty much the same as thousands of others, even if you try to be artistic.) It’s as if the technologies are a given, and cannot be unmade, remade, disassembled and put together differently. This attitude has technomaniac aspects (infatuation with the smooth interface) and technophobic aspects (not wanting to understand the tool, not daring to open the black box). The consequence is also that the technology does not feel like our own. It is as if it is owned by ‘them’, and we have only been given a license to use it under ‘their’ rules. (Sometimes this is actually the case.)

These tools are not really simple – they are intimately entangled with our lives, in very complex ways, and our lives become ever more intimately entangled with them. What is to come? Sensors on your body register not only your location, but also physical data like heart rate and adrenaline levels. (What do you think the current popularity of electronic lifestyle coaches will lead to?) New cars continually transmit where they are. Google has a record of your search behavior, which furnishes important information. (Did you already upload the data from your last training session to Strava?) Various data-mining companies have built profiles based on your user data. How valuable all this is for insurance companies – it will be quite clear whether you’re part of a high-risk group or not! The algorithms that dig into this Big Data and Deep Data become ever better at their task. And the more there is to analyze, the better they become at predicting future behavior, and at issuing warnings about you – if you happen to be up to something. (You might not even know it yourself yet.)

We already live in a world where we continually, consciously and unconsciously, leave traces, and those traces will be saved, read, compared and analyzed. It’s a misunderstanding to think of these developments as a massive invisible ‘invasion’ of our privacy. (Privacy regulations are important, but it is not very productive to think about this development in terms of privacy only.) Leaving traces simply comes with the tools we use. Are you able to access these data yourself? Are you allowed to build your own ‘tools’ with them, extract meaning from them, share them with others, build applications around them? If we care about our technologies that habitually transmit data as part of their functionality, we should also care about what happens to the data.

In a recent text for the German magazine DeBug, Sacha Kösch wrote that it is not even very productive to think in terms of ‘my data’: “Anyone who wants to think seriously about surveillance must stop thinking in terms of ownership of data – one’s own presence has already generated data – and one also has to stop assuming that the internet is the main problem. You have to become aware of the fact that you always produce data and that the data will always be read by someone. It just depends on knowing exactly how you can fight against it, and with what means.” (Sacha Kösch, 2013, p. 15, my translation) Can we use the data ourselves?

To see something as a technology should mean that we see it as something which is made, which is constructed, and thus can be deconstructed, unmade, remade, added to, changed and built upon. What is set up can be broken down. To see something as a technology should mean that we recognize it as something which holds the potential to create – and to make things happen. It should mean that we recognize the technology as something which belongs to us – and not to ‘them’. In principle we never have to (nor should have to) blindly and passively accept technologies as they are ever more seductively presented to us. (That’s a tricky one in a world where, for instance, Google and Facebook have amassed enormous power thanks to their number of users – and seduce us with flawlessly functional tools for which we pay by giving away our personal data.)

It is crucial to remember that technology is never only technical: a technological arrangement is a mesh – or a mess, if you like. It is impossible to separate the technological from social aspects, from morality, law, politics, culture, aesthetics, even religion (as Latour likes to stress). The issue is not to try to purify all those relations and get at ‘pure technology’, ‘pure politics’, et cetera; the issue is to realize how we are attached to them. What are our attachments, how do we attach, and how would we like to be entangled? How do we become ever more intimately attached?

In another recent text Latour writes: “But our sin is not that we created technologies but that we failed to love and care for them.” (Latour, 2012). This is also the moral of his book Aramis, or the Love of Technology. The sin of Frankenstein (in Mary Shelley’s tale) is not that he made a monster; the sin is that he did not care for it. We need to care for the technologies we have created, the technologies that have rearranged social ties, politics, the economy, our culture, our lives.

To come back to technomania and technophobia: maybe a little bit of technomania, a short period of blind love, can change technophobia into technophilia, and then into a more realistic attitude, in which we start to take care of our technologies.

References

Sacha Kösch, ‘Big Brother in der Umkleide. Überwachung und Alltag’, DeBug 175, September 2013.

Bruno Latour, Aramis, or the Love of Technology, Harvard University Press, Cambridge, Mass., 1993.

Bruno Latour, ‘Love Your Monsters’, http://thebreakthrough.org/index.php/journal/past-issues/online-content//the-monsters-of-bruno-latour/, 2012.

Bruno Latour, An Inquiry into Modes of Existence, Harvard University Press, Cambridge, Mass., 2013. See also http://modesofexistence.org/index.php/site/index
