[This article is also available in Finnish.]
Recently, even mainstream politics has finally begun to wake up to the fact that the ongoing environmental catastrophe cannot be overcome by superficial tinkering alone: consumption of all kinds must be reduced at all levels, and lifestyles must change along with it.

Digital technology is often seen as a kind of saving angel that will enable the economy to decouple from material consumption. But this is an illusion. The storage, transmission and processing of every bit requires matter and energy, and when there are huge masses of bits, the consumption of matter and energy is also huge - and most of this consumption is unnecessary for the functioning of societies and the quality of people's lives.
The evolution of the material and energetic facilities of computing over the last half century has been described by Moore's Law, which observes that the number of transistors that fit on a microchip doubles roughly every two years. This development has made possible, for example, cheap personal computers, but it has also instilled in the industry an attitude that it is okay to waste resources - after all, in a few years there will be twice as many of them. This indifference has also provided an excuse to shorten hardware life cycles, thus multiplying the environmental problems associated with the electronics industry and e-waste.
Now, the physical limits of downsizing are being reached, and Moore's Law has been predicted to end by 2025. With the current trends, this would mean that the global energy consumption of information technology will explode unless both its developers and its users change their attitudes towards it.
Current trends include, for example, the rapid growth of data traffic, an increasing proportion of which is due to the streaming of video content (IEA). To date, the energy efficiency of networks has increased at roughly the same rate as the traffic, which has allowed the networks' energy consumption to remain fairly stable, but by 2025 this "balance of terror" could be a thing of the past.
To correct the attitudes, we need visions of how things could be different, but especially an understanding of what specifically needs to be fixed. In this text, I will focus on the latter.
I will identify two "isms" - Maximalism and Virtualism. Maximalism is the self-serving glorification of growth and abundance, while Virtualism is the obsessive hiding of 'dirty' things out of sight and out of mind. Both phenomena are present in society and culture at large, but in computing they have been able to gain a lot of ground.
Maximalism is my own concept here. It is not related to, for example, artistic or political notions of Maximalism, but is more akin to the idealisation of endless economic growth.
In computational esthetics, Maximalism manifests itself especially as the estheticisation of size and abundance. For example, images contain more and more pixels, colors and details, which require more and more capacity for storage and computation. Even when a technological invention allows us to do more with less - for example, a better video compression codec - the invention is used, in line with the Jevons paradox, to accelerate quantitative growth: doing more with more.
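The dynamic can be illustrated with rough arithmetic (all numbers here are invented for illustration): suppose a new codec halves the bits needed per pixel, but the gain is spent on jumping from 1080p to 4K, which quadruples the pixel count - the total bitrate still doubles.

```python
# Hypothetical illustration of the Jevons paradox in video:
# a codec that halves bits-per-pixel, combined with a jump from
# 1920x1080 to 3840x2160, still doubles the total bitrate.
old_pixels = 1920 * 1080         # "current generation" resolution
new_pixels = 3840 * 2160         # 4x the pixels
efficiency_gain = 0.5            # new codec: half the bits per pixel

relative_bitrate = (new_pixels / old_pixels) * efficiency_gain
print(relative_bitrate)  # -> 2.0
```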
I also associate Maximalism with the notion that technological progress is inextricably linked to quantitative growth and is not possible without it - in the same way that mainstream economic thought imagines GDP to be directly linked to welfare. This belief has led to a disproportionate search for new IT breakthroughs in areas opened up by increased computing speeds and memory capacities.
As a recent example of the push for scale in research, consider language models for natural language processing: in 2018, the largest models such as GPT-1 had just over 100 million parameters; two years later, the GPT series had ballooned to 175 billion parameters (GPT-3); and at the time of writing, the largest language models already have over 1.5 trillion parameters (Google's Switch-C). The amount of computation involved in deep learning research has doubled every few months - between 2012 and 2018, there was a 300,000-fold growth. And this was only the research part - once all sorts of entities start tailoring huge neural networks for all sorts of purposes, we can expect quite an explosion in energy consumption.
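To make these figures concrete, here is a back-of-the-envelope sketch of the raw memory needed just to store the parameters, assuming 32-bit floating point, i.e. 4 bytes per parameter (an assumption; real deployments often use lower precision):

```python
# Rough storage footprint of model parameters, assuming 4 bytes
# (32-bit float) per parameter. The parameter counts are the ones
# cited in the text; GPT-1's exact count is 117 million.
def param_gib(n_params, bytes_per_param=4):
    return n_params * bytes_per_param / 2**30  # GiB

for name, n in [("GPT-1", 117e6), ("GPT-3", 175e9), ("Switch-C", 1.6e12)]:
    print(f"{name}: {param_gib(n):,.1f} GiB")
```

Even at this optimistic precision, GPT-3's parameters alone occupy roughly 650 GiB - far beyond the memory of any single consumer device.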
Another example of Maximalism is provided by cryptocurrencies such as Bitcoin, whose value systems are based on maximally efficient vanity computation - in effect, a maximal waste of energy. Although cryptocurrencies are not yet used by large masses, the "mining" of Bitcoin alone currently consumes around 110 terawatt-hours per year, close to the energy consumption of the entire Netherlands in 2019. Most crypto-mining currently takes place in China using coal-based electricity.
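The scale of that figure can be checked with simple arithmetic: 110 TWh spread over a year corresponds to a continuous power draw of more than 12 gigawatts, running around the clock (the conversion below uses only the 110 TWh figure cited above):

```python
# Convert an annual energy figure into an average continuous
# power draw. 110 TWh/year is the estimate cited in the text.
twh_per_year = 110
hours_per_year = 365 * 24      # 8760

avg_power_gw = twh_per_year / hours_per_year * 1000  # TW -> GW
print(f"{avg_power_gw:.1f} GW")  # -> 12.6 GW
```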
Sometimes resources are wasted simply because their abundance eliminates worries about even the most ridiculous overuse. Add to this the software industry practices and tools that encourage this indifference, and you get a huge amount of bloat in software: it is not at all uncommon, for example, for a browser-based application to be a thousand times larger than a functionally identical application would be if it were implemented with care. And since browser applications are reloaded with every page load, the amount of unnecessary traffic can end up being considerable.
Since the 'bit world' maximizes the use of the kind of resources whose physical manifestations are constantly shrinking, it has been possible to relate to the growth in ways that, when adapted to the material world, would be roughly equivalent to the limitless, explosive expansion of human civilization into outer space - which happens to be a vision of the future popular in Silicon Valley.
In everyday computing, Maximalism is concretely reflected in the way the content shared on social media keeps bloating up. Images attract more attention than text alone, and video clips attract more attention than still images. A service may even reward those who prefer to share videos - perhaps because video-watching is considered addictive. So social media platforms, too, are competing over whose service is the most maximalist.
Maximalism is often connected to the pursuit of photorealism, for example in video game graphics. Photorealism is a relatively unimaginative goal - everyone knows what reality looks like - and at the same time an impossible one: the sharp-eyed will learn to distinguish even the most faithful imitations from the real thing. This makes it convenient for the market: it is easy to make the "current generation" of reality-imitation look outdated by contrasting it with the "next generation", and this contrast also serves as a way to sell new microchips and devices.
The Maximalism of the games industry is also reflected in changes in language. For example, in the 1980s, the word "gaming machine" (Finnish: pelikone) was often a pejorative - referring to a machine that was not suitable for serious use. Games were therefore an application that even the smallest microcomputer was capable of running. In the 21st century, a 'gaming machine' is more likely to be expensive, powerful and in need of constant upgrades. Similarly, we might consider the term 'good graphics', which in the past referred, for example, to skilfully drawn and esthetically pleasing pixel images and well-programmed graphics routines, but which in the 21st century has come to refer more to the technical graphics capabilities of the hardware and whether a game supports them.
There have even been attempts to take steps towards photorealism in areas where it is poorly suited, such as user interfaces. For example, in 2001, Apple's interface design guidelines talked about a 'photo-illustrative' icon style that 'approaches the realism of photography'. Later on, Apple softened its recommendation and began to emphasize simplicity and clarity.
Maximalism and planned obsolescence feed each other. When technologies from different 'generations' can be reduced to one-dimensional numbers, where bigger is always unequivocally better in every way, there is no justification for the existence of the old, and it is easy to throw it on the waste heap. 4K is replaced by 8K, 4G is replaced by 5G. Despite the fact that there is already an unimaginable abundance of resources, this abundance is made to appear temporary and inadequate.
Quantitative growth is often accompanied by an increase in complexity. In order to build and manage increasingly complex systems, there must be ways to manage the complexity. For example, when managing large numbers of people, it is practical to blur their individuality and think of them as non-differentiated 'gamepieces'.
In computing, complexity is typically hidden behind layers of abstraction, making the whole appear simpler and 'cleaner'. An abstraction may also give a misleading impression of what it represents - for example, a virtual server acts like a physical server for programs and users, even though it is only a software imitation of one.
Like Maximalism, Virtualism is also present in society in general, but there are exceptionally good conditions for it in computing. The Church-Turing thesis, one of the most fundamental truths of computer science, states that all 'universal' models of computation are equivalent to each other, i.e. in theory any computer can perform any task, given enough time and memory. A powerful machine can thus stack a huge number of layers of abstraction on top of each other, each of which disguises the layers underneath as something else.
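This stacking can be sketched with a toy example (the two-instruction language and its names are invented for illustration): a tiny interpreter written in Python adds one more layer on top of Python, which itself runs on a C runtime, which runs on the hardware - and the program behaves identically however many such layers sit below it.

```python
# A toy interpreter: one more layer of abstraction on top of
# Python. Each layer disguises the one beneath it; the abstract
# program neither knows nor cares how deep the stack goes.
def run(program, value=0):
    for op, arg in program:
        if op == "ADD":
            value += arg
        elif op == "MUL":
            value *= arg
    return value

print(run([("ADD", 2), ("MUL", 10)]))  # (0 + 2) * 10 -> 20
```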
'Virtualism' means here a way of thinking in which hiding and disguising the complexity of a system, or its inner workings in general, is self-serving and extreme. Details that the user 'doesn't need to see' are considered dirty and ugly, something that needs to be hidden from them even when it might actually be relevant. For example, the user of a smartphone application may not even be given the opportunity to know which of the files they are using are stored on the device itself and which are on an external server. Virtualism also extends to the physical appearance of the phone - its monolithic design and lack of moving parts keep its internal workings as hidden as possible from the user.
In digital esthetics, Maximalism and Virtualism support each other. Virtualism seeks to obscure the fact that an image is made up of pixels, and this is best achieved by maximizing the pixel density. The ideal is a completely mirror-like, undisturbed and noiseless communication channel, whose existence, let alone its characteristics, cannot be detected at all. This ideal, like photorealism, can be approached indefinitely, for example by increasing the number of pixels and the bit rate of the video stream, because the discerning eye will always find something to improve.
Typical virtualistic imagery (the kind you see in Google image searches such as "virtual" and "digital") evokes images of immaterial floating and clinical purity: patterns and figures made up of light strands and dots float immaterially against a blue background, and if there is a physical environment, it is artificial, sterile and often also blue.
Virtualism also appears in IT metaphors. For example, the word 'cloud' refers to large, energy-hungry computer rooms that rent storage and computing capacity to customers, commoditized so that they don't have to worry about physical servers. The word "cloud" summons images that are as far away as possible from concrete computer hardware consisting of separable parts. It is something homogeneous, gaseous and ever-changing, something that is pointless to even try to compartmentalize in one's mind, but which, on the other hand, brings peace of mind as it floats against a blue background.
There are also structures that encourage people to virtualize themselves. In mainstream social media services, users adapt to tight moulds in order to comply with the interface specifications defined by the owner. Jaron Lanier, one of the early developers of virtual reality, criticized the phenomenon in his book 'You Are Not a Gadget' back in 2010, and even I wrote about it around the same time, but it seems that even today people are still mostly unaware of how the 'game mechanics' of different services drive their behavior.
Above all, Virtualism causes alienation. People become alienated not only from the technology they are using and its tangible material framework, but also from reality in general and even from themselves.
Maximalism and Virtualism are so deeply embedded in computing that it can be difficult to separate them from the whole. Even I myself might not see them so clearly if I had not gotten into the demoscene, where the relationship with technology is in many ways the opposite: hardware never becomes obsolete, there is a profound focus on its tangible aspects, and doing spectacular things with as few resources as possible is valued above everything else.
Technologists have often wanted to be 'on the right side of history', and perhaps that is why many have found it difficult to acknowledge the contribution of information technology to environmental problems. 'Sustainable ICT' as a field of study only emerged in 2007, and even the mainstream has typically ignored both the root causes of the problems and the existence of planetary limits. Frustrated by this, some researchers started organizing events called Computing within Limits in 2015, where the problems are tackled in ways that acknowledge reality, but it is still a small fringe. A radical, large-scale change is urgently needed.

What about those who have no influence on policy or technological developments? Perhaps all they need here is some general advice that applies to many other issues as well: be aware of the problems of both material and 'immaterial' consumption, remember moderation in all things, and get out into nature once in a while.