Tuesday, 5 August 2014

The resource leak bug of our civilization


A couple of months ago, Trixter of Hornet released a demo called "8088 Domination", which shows off real-time video and audio playback on the original 1981 IBM PC. This demo, among many others, contrasts favorably with today's wasteful use of computing resources.

When people try to explain the wastefulness of today's computing, they commonly offer something I call the "tradeoff hypothesis". According to this hypothesis, the wastefulness of software is compensated for by flexibility, reliability, maintainability, and, perhaps most importantly, cheap programming work. Even Trixter himself favors this explanation.

I used to believe in the tradeoff hypothesis as well. I saw demo art on extreme platforms as a careful craft that attains incredible feats while sacrificing generality and development speed. However, during recent years, I have become increasingly convinced that the portion of true tradeoff is quite marginal. An ever-increasing share of the waste comes from abstraction clutter that serves no purpose in final runtime code. Most of this clutter could be eliminated with more thoughtful tools and methods without any sacrifices. What we have been witnessing in the computing world is nothing utilitarian but a reflection of a more general, inherent wastefulness that stems from the internal issues of contemporary human civilization.

The bug


Our mainstream economic system is oriented towards maximal production and growth. This effectively means that participants are forced to maximize their portions of the cake in order to stay in the game. It is therefore necessary to insert useless and even harmful "tumor material" into one's own economic portion in order to avoid losing one's position. This produces an ever-growing global parasite fungus that manifests as things like black boxes, planned obsolescence and the artificial creation of needs.

Using a software development metaphor, it can be said that our economic system has a fatal bug: one that continuously spawns new processes that allocate more and more resources without ever releasing them, eventually stopping the whole system from functioning. Of course, "bug" is a somewhat normative term, and many bugs can actually be reappropriated as useful features. However, resource leak bugs are very seldom useful for anything other than attacking the system from the outside.
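
For readers who don't program, the metaphor is easy to make literal. The sketch below (my own Python illustration, not anything from the systems discussed here) keeps allocating memory and never releases any of it, so its footprint only grows until the machine chokes: the computational counterpart of the "tumor material" described above.

    # A deliberately buggy sketch of a resource leak: each iteration
    # allocates a new buffer and keeps a reference to it, so nothing
    # can ever be reclaimed and memory use climbs monotonically.
    # (Illustrative only: interrupt it before it exhausts your RAM.)
    import time

    hoard = []  # grows forever and is never emptied

    def leaky_step(size_bytes=10_000_000):
        """Allocate roughly 10 MB and never let go of it."""
        hoard.append(bytearray(size_bytes))

    while True:
        leaky_step()
        print(f"holding ~{len(hoard) * 10} MB, released: 0")
        time.sleep(0.1)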

Bugs are often regarded as necessary features by end-users who are not familiar with alternatives that lack the bug. This also applies to our society. Even if we realize the existence of the bug, we may regard it as a necessary evil because we don't know of anything else. Serious politicians rarely talk about trying to fix the bug; on the contrary, it is becoming more common to embrace it. A group that calls itself "Libertarians" even builds their ethics on it. Another group called "Extropians" takes the maximization idea to the extreme by advocating an explosive expansion of humankind into outer space. In the so-called Kardashev scale, the developmental stage of a civilization is straightforwardly equated with how much stellar energy it can harness for production-for-its-own-sake.
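
As an aside for concreteness (my own gloss, not something spelled out in the post): the continuous version of the Kardashev scale popularized by Carl Sagan rates a civilization purely by the power it commands, K = (log10 P - 6) / 10 with P in watts, so "development" literally reduces to energy throughput. A tiny Python rendering:

    # Sagan's continuous interpolation of the Kardashev scale (an editorial
    # aside, not the author's formula): K = (log10(P) - 6) / 10, P in watts.
    from math import log10

    def kardashev(power_watts):
        return (log10(power_watts) - 6) / 10

    print(kardashev(1e16))  # ~1.0: "Type I", roughly planetary-scale power
    print(kardashev(1e26))  # ~2.0: "Type II", the output of a whole star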

How the bug manifests in computing


What happens if you give this buggy civilization a virtual world where the abundance of resources grows exponentially, as in Moore's law? Exactly: it adopts the extropian attitude, aggressively harnessing as many resources as it can. Since the computing world is virtually limitless, it can serve as an interesting laboratory example where the growth-for-its-own-sake ideology takes a rather pure and extreme form. Nearly every methodology, language and tool used in the virtual world focuses on cumulative growth while neglecting many other aspects.

To concretize, consider web applications. There is a plethora of different browser versions and hardware configurations. It is difficult for developers to take all this diversity into account, so the problem has been solved by encapsulation: monolithic libraries (such as jQuery) that provide cross-browser-compatible utility blocks for client-side scripting. Also, many websites share similar basic functionality, so it would be a waste of labor time to implement everything specifically for each application. This problem has also been solved with encapsulation: huge frameworks and engines that can be customized for specific needs. These masses of code have usually been built upon previous masses of code (such as PHP) that were designed for exactly the same purpose. Frameworks encapsulate legacy frameworks, and eventually most of the computing resources are wasted by the intermediate bloat. The accumulation of unnecessary code dependencies also makes software more bug-prone, and debugging becomes increasingly difficult because of the ever-growing pile of potentially buggy intermediate layers.

Software developers tend to use encapsulation as the default strategy for just about everything. It may feel like a simple, pragmatic and universal choice, but this feeling is mainly due to the tools and the philosophies they stem from. The tools make it simple to encapsulate and accumulate, and the industrial processes of software engineering emphasize these ideas. Alternatives remain underdeveloped. Mainstream tools make it far more cumbersome to do things like metacoding, static analysis and automatic code transformations, which would be far more relevant than static frameworks for problems such as cross-browser compatibility.
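
To make the alternative a little more concrete (a toy sketch of my own, not a tool the text refers to): an "automatic code transformation" rewrites source code in a build step instead of wrapping it in a runtime compatibility layer. Below, Python's standard ast module renames calls to a hypothetical legacy function, old_fetch, before the program ever runs; a real tool would target actual browser quirks rather than a made-up API.

    # A minimal sketch of automatic code transformation: calls to a
    # hypothetical old_fetch() are rewritten to new_fetch() at build time,
    # so no wrapper layer needs to exist at runtime. Requires Python 3.9+.
    import ast

    class RenameCall(ast.NodeTransformer):
        def visit_Call(self, node):
            self.generic_visit(node)
            if isinstance(node.func, ast.Name) and node.func.id == "old_fetch":
                node.func.id = "new_fetch"
            return node

    source = "data = old_fetch('//example.org/feed')\nprint(data)\n"
    tree = RenameCall().visit(ast.parse(source))
    print(ast.unparse(tree))  # emits the transformed source, wrapper-free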

Tell a bunch of average software developers to design a sailing ship. They will do a web search for available modules. They will pick a wind power module and an electric engine module, which will be attached to some kind of floating module. When someone mentions aero- or hydrodynamics, the group will respond by saying that elementary physics is a far too specialized area, and that it is cheaper and more straightforward to just combine pre-existing modules and pray that the combination works sufficiently well.

Result: alienation


Building complex systems from more-or-less black boxes is also how our industrial society is constructed; computing just takes it to a greater extreme. Modularity in computing therefore relates very well to the technology criticism of philosophers such as Albert Borgmann.

In his 1984 book Technology and the Character of Contemporary Life, Borgmann uses the term "service interface", which even sounds like software development terminology. Service interfaces often involve money. People who have a paid job, for example, can be regarded as modules that try to fulfill a set of requirements in order to remain acceptable pieces of the system. When spending the money, they can be regarded as modules that consume services produced by other modules. What happens beyond the interface is considered irrelevant, and this irrelevance is a major source of alienation. Compare someone who grows and chops their own wood for heating to someone who works in the forest industry and buys firewood with the paycheck. In the former case, it is easier to become genuinely interested in all the aspects of forests and wood, because they directly affect one's life. In the latter case, fulfilling the unit requirements is enough.

Perceiving the world as modules or devices operated via service interfaces is what Borgmann calls the "device paradigm". This is contrasted with "focal things and practices", which tend to have a wider, non-encapsulated significance in one's life. Heating one's house with self-chopped wood is focal, and arts and crafts offer many further examples of focality. Borgmann urges a restoration of focal things and practices in order to counteract the alienating effects of the device paradigm.

It is increasingly difficult for computer users to avoid technological alienation. Systems become increasingly complex, and taking a genuine interest in their inner workings is easily discouraged: if you learn something, the information probably won't stay current for very long; if you modify something, subsequent software updates will break it. It is extremely difficult to develop a focal relationship with a modern technological system. Even hard-core technology enthusiasts tend to ignore most aspects of the systems they are interested in. As ever-complexifying computer systems grow ever more deeply ingrained in our society, they become increasingly difficult to grasp even for those who are dedicated to understanding them. Eventually even they will give up.

Chopping one's own wood may be a useful way to counteract the alienation of the classic industrial society, as old-school factories and heating stoves still have some basics in common. In order to counteract the alienation caused by computer technology, however, we need to find new kinds of focal things and practices that are more computerish. If they cannot be found, they need to be created. Crafting with low-complexity computer and electronic systems, including the creation of art based on them, is my strongest candidate for such a focal practice among those that already exist in subcultural form.

The demoscene insight


I have been programming since my childhood, for nearly thirty years. I have been involved with the demoscene for nearly twenty years. During this time, I have grown a lot of angst towards various trends of computing.

The extreme categories of the demoscene -- namely eight-bit democoding and extremely short programs -- have been helpful for me in managing this angst. These branches of the demoscene serve as a useful countercultural mirror that contrasts with the trends of industrial software development and helps in grasping its inherent problems.

Other subcultures have been far less useful for me in this endeavour. The mainstream of open source / free software, for example, is a copycat culture, despite its strong ideological dimension. It does not actively question the philosophies and methodologies of the growth-obsessed industry but actually embraces them when creating duplicate implementations of growth-obsessed software ideas.

Perhaps the strongest countercultural trend within the demoscene is the shift of focus towards ever tighter size limitations, or as they say, "4k is the new 64k". This trend is diametrically opposed to what the growth-oriented society is doing, and it forces one to rethink even the deepest "best practices" of industrial software development. Encapsulation, for example, is still quite prominent in the 4k category (4klang is a monolith), but in the 1k and smaller categories, finer methods are needed. When going downwards in size, paths considered dirty by the mainstream need to be embraced. Efficient exploration and taming of chaotic systems requires tools that are deeply different from those used before. Stephen Wolfram's ideas, as presented in "A New Kind of Science", can perhaps provide useful insight for this endeavour.

Another important countercultural aspect of the demoscene is its relationship with computing platforms. The mainstream regards platforms as neutral devices that can be used to reach a predefined result, while the demoscene regards them as a kind of raw material that has a specific essence of its own. Size categories may also split platforms into subplatforms, each of which has its own essence. The mainstream wants to hide platform-specific characteristics by encapsulating them in uniform straitjackets, while the demoscene is more keen to find a suitable esthetic approach for each category. In Borgmannian terms, demoscene practices are more focal.

Demoscene-inspired practices may not be the wisest choice for pragmatic software development. However, they can be recommended for the development of a deeper relationship with technology and for diminishing the alienating effects of our growth-obsessed civilization.

What to do?


I am convinced that our civilization is already falling and that this fall cannot be prevented. What we can do, however, is create seeds for something better. Now is the best time for doing this, as we still have plenty of spare time and resources, especially in rich countries. We especially need to propagate these seeds towards laypeople who are already suffering from increasing alienation because of the ever more computerized technological culture. The masses must realize that alternatives are possible.

A lot of our current civilization is constructed around the resource leak bug. We must therefore deconstruct the civilization down to its elementary philosophies and develop new alternatives. Countercultural insights may be useful here. And since hacker subcultures have been forced to deal with the resource leak bug in its most extreme manifestation for some time already, their input can be particularly valuable.

26 comments:

Dave said...

Your conclusion was derived in its essence 150 years ago by Karl Marx. He was right. So are you. Unfortunately you write like a gigantic wanker and nobody can read this without wanting to punch you

blank said...

disagree w angrydave; nice post. replicating the comment I made on ufblog.net about it:

‘limitlessness’ becomes delusional when encapsulated (ideology); difference between ‘cumulative growth’ in open and closed systems.
what even is a ‘closed’ system, other than one for which the really-encapsulating open system has been delusionally encapsulated (as a dependency on / assumption of some particular kind of limitlessness).
also, functions of division of physical labor & of intellectual labor are probably not analogous, considering theoretical comprehension as data compression, vs. physical economy of scale, which could be looked at as an expansion of the technical procedure-theorem (more like reciprocals than analogues).

please forgive (speculative) terseness.

Pinku-Sensei said...

You sound like you should be reading and hanging out at "The Archdruid Report," if you aren't already (I searched for comments by you as viznut, but didn't locate any). He agrees with you that our civilization is falling and that the fall can't be stopped. He's also a fan of Wolfram's "A New Kind of Science." I think he would like the concept of "a resource leak bug" if you could explain it to him concisely enough, if for no other reason than the sound of the phrase. It's catchy.

Anonymous said...

...been reading Guy Debord's 'Society of the Spectacle'?

Anonymous said...

so logical. thanks!

Anonymous said...

This is a fantastic blog post. I congratulate you. I must admit that I have reached the same conclusions, although I have probably taken a very different path in coming to them.

I could use this blog post as a starting point to write a book from. But I will limit myself to this comment.

I am a Forth programmer. It is a language which has been rejected by the mainstream of computing for a very long time now. I regard it as programming in an extensible macro assembler for a very simple virtual machine. It was invented in 1968, and even though it is old now, and has been rejected by the mainstream of computing society, it still has users, a few shops who develop and maintain commercial systems, some open source systems, and a community of developers, both professionals and hobbyists.

Forth philosophy emphasizes pragmatism, and being in tune and focused on the problem, rather than hiding behind encapsulation and software libraries.

Viznut, if you are reading this, I encourage you to investigate Forth. If there is a way to contact you directly, I may be interested in corresponding with you briefly via email.

Jason

Anonymous said...

Viznut, Others,

Please excuse me while I read through your post again and offer additional comments. I just can't help myself. :^)

> today's wasteful use of computing resources

Something that I have heard people in the Forth community complain about before.

> I used to believe in the tradeoff hypothesis as well.

> abstraction clutter that serves no purpose in final runtime code.

and complain about that, and then suggest that software layers should be removed in favor of actually writing the code yourself.

> thoughtful

a term which I have heard Jeff Fox, one of our most famous programmers (now deceased), use.

> What we have been witnessing in the computing world is nothing utilitarian but a reflection of a more general, inherent wastefulness that stems from the internal issues of contemporary human civilization.

I agree. I think that civilization must necessarily, by the very foundation of what it is, by the most fundamental memetic DNA of what it is, take as many resources as possible. This is reflected in everything, including its technology infrastructure.

> Our mainstream economic system is oriented towards maximal production and growth.

As civilization must necessarily not allow anything in the universe to exist other than itself. It must outgrow everything else, and gobble up everything else.

> Using a software development metaphor

Which I have been doing myself for the past several years, as I have striven to understand what is going on.

> it can be said that our economic system has a fatal bug

not just the economic system, but all of it in its entirety

> A group that calls itself "Libertarians" even builds their ethics on it.

As an Ex-Libertarian, I will finally agree with this as well. But, I should also state that I have rejected Marxism as well as Fascism, and all other forms of civilizational thinking. I write my own worldview, and do not take up that of others. I just write my own code if you will. Another book which needs to be written.

> Another group called "Extropians" takes the maximization idea to the extreme by advocating an explosive expansion of humankind into outer space.

And yet another book which needs to be written. It is painful for me to still harbor the dream of Gerard K. O'Neill while realizing that I don't think that I could live in a space colony with people like Elon Musk and Robert Zubrin. I would be afraid to. Who do you think would get to be the slave? I could also bring up Jamestown and what happened there. That is ugly.

> What happens if you give this buggy civilization a virtual world where the abundance of resources grows exponentially

I can think of Second Life. The resources are not infinite, but I don't like going in there. It's not a very nice place in my opinion. They use a lot of horsepower for that game, and it still seems pretty clunky to me. And then there are the lowlifes to deal with, when you can find anybody at all to hang out with. It seems like a bad town at 4:00 in the morning to me.

Jason

Anonymous said...



> encapsulation: monolithic libraries
> ever-growing pile of potentially buggy intermediate layers.

sounds like Forth talk to me.

> automatic code transformations

I think that I'll start working on that after I get done with this post.

> "service interface"

or get more focal with software and hardware. That is what I would like. I'm going to "chop my own wood" so to speak. I ought to feel a little ashamed that it's really a sit down job though.

> "device paradigm"

What happens when, in their country-club Mars colony, their stuff breaks down and they are too busy partying to know how to repair their gear? Or it is too complex to repair? But the replacement parts are on a spaceship which won't arrive at Mars for another six months?

> technological alienation

I try to avoid it to some small degree by being a Forth programmer.

> Crafting with low-complexity computer and electronic systems

Forth

> extremely short programs

The inventor of Forth, Charles H. Moore, has designed and implemented his own CPU with a CAD system written in Forth. First he crafted his own Forth programming language, which is also an operating system. Then using that he wrote the CAD system. Then he used the CAD system to design his own CPU, which is designed to run a version of Forth which is also the machine language of the chip. The CPU consists of 144 cores. Each core contains exactly 64 words of RAM and 64 words of ROM. Each memory cell of RAM can hold 3 or 4 Forth machine language instructions. You are expected to keep your program inside of 64 cells of RAM. If your program gets any bigger than that, you must factor your code to run on more than one CPU core. There is a simple mechanism for the cores to communicate with each other. There is even a way for one core to present instructions to another core for execution. This is a thing which exists in the real world. :^)

> Encapsulation, for example, is still quite prominent in the 4k category (4klang is a monolith)

I won't encapsulate. I have to admit that once in a while I will slap together a website for a little bit of money. Whenever jQuery is involved, I feel like I am being untrue to myself.

> demoscene regards them as a kind of raw material that has a specific essence of its own

I can understand that.

> Demoscene-inspired practices may not be the wisest choice for pragmatic software development.

Well, in the demoscene, using anything besides assembly would probably be looked down upon. Forth is programming for a virtual machine, but a simple one. Many Forth systems fit inside of 4K, but larger ones are more prominent now with 32- and 64-bit CPUs these days.

> A lot of our current civilization is constructed around the resource leak bug.

When I write my programs, I never use bounds checking. I just run them, and if my program crashes, I fix it so that one part of the program won't clobber another part of the program. That is fair to me.

Jason

zzo38 said...

Very good report.

I like Forth programming too, and I also like to write programs for the Nintendo Famicom (for demoscene purposes, it can be considered an NTSC NES in most (though not quite all) cases).

I do like to write code myself; often in C for portability; for Famicom I will write in assembly language.

Using existing libraries (especially Unicode and web browsers) can be more than you need.

With free-software/open-source, the libraries are slightly less "chunked" than with proprietary DLL modules or whatever, since with free-software/open-source you can view the programs, if you need only one algorithm you can just learn it and put into your program, or you can use compile-time macros to deal with it, etc.

But of course there is also writing it by yourself, whether it is free-software or not!!

I don't use jQuery, Google CSS, etc; in fact when I do my own systems I generally won't even use HTML at all but when I do it is pretty simple and stuff.

Encapsulation and abstractions can sometimes help, but usually it just gets in the way. I know, because I have experiences with it, too. (I do sometimes use it: My OpenID implementation is based on an existing software library, although I wrote the login form myself; much simpler than most other implementations.)

Windows and Linux are both too complicated; DOS is simpler. Modern x86 is too complicated; 6502 is simpler.

Learn also esoteric programming; compare and contrast.

And then learn making CPU by computer hardware!

Anonymous said...

You might enjoy Peter Turchin's books on the rise and fall of civilizations, which have similar ideas.

If you do the calculation, assuming population expands at 1% per annum, then humanity will outgrow a sphere of light expanding outwards from here+now within a few thousand years. Growth must trend to zero unless we discover infinite new dimensions or something.

buhrmi said...

JUST FIX THE FUCKING BUG. GOTTA FIX IT.

Anonymous said...

I love the firewood analogy. Very clever and astute. I used to program on the TI99/4A as well as the Apple ][e (yes, that is how it was written) and stuff like that... the Trash-80 too! ;) I recall doing this kind of thing, trying to maximize graphics (Missile Command was a good one to program) with limited space and speed. And I agree that the whole world is wasting so much... you gotta figure 99% of a modern computer is being wasted, when you look at things from the perspective of the author. Nice article anyways.

Dodgy_Coder said...

Wastefulness can be addressed by using the best method of resource allocation known - the free market. Start by putting a price on carbon, charging people more for throwing out their garbage, and increasing the reward for recycling. It's pretty simple, but it's not very popular.

Anonymous said...

Great post.

But we the libertarians hope to achieve independence by chopping our own wood.

And this civilization will not fail because we have all the tools to communicate the solution and dictate it over the kleptocratic political system.

We have Wikipedia for truth-polishing and blockchain for peer-to-peer voting and commerce.

I think the path is to be politically involved but not in the egocentric/behind the scenes/kleptocratic hellenic way.

To be more clear, I want to say that we use a 2000-year-old political system in the time of the internet. It is time for provable knowledge over the knowledge of the masses. Determinism over conformism.

It is not important who says it or how they say it, but what is said, after all.

m50d said...

Your metaphor is unsupported. Resources are wasted because they are cheap - but they really are cheap. If the efficiency of our computer programs were actually a problem, we would solve it - there are certainly companies that write more or less efficient programs, often competing in the same market. It turns out programmer time is usually more important; strapping together a bunch of existing components makes you more money than crafting from scratch, because the result really is more valuable. If anything, our ape instincts cause us to spend too much time crafting things carefully, when really we would be better off just making a disposable version.

M. Altemark said...

m50d: Resources are not cheap, because computers run on the electric power grid, and the electric power grid as it is today is part of a giant, wasteful, market-managed capitalist economy which cannot possibly sustain us or our children.

Sbate said...

I was experiencing this today - usually I call it existential angst, but you really understand it. There is a cutoff, just as you cannot teach a dog a trick that is too many levels deep - we have a cutoff where we just stop caring. Also, if you meet resistance, you weigh the whole thing and give up. Like a realtor putting up a for-sale banner bigger than the restaurant's "open" sign: you call the four-foot-high phone number and they think you are crazy for questioning them, as if I have no say in the situation, as if I cannot have an opinion because I am not the owner of the building. It is a moral choice you are talking about - making the decision to overbuild is the same as starving to death because your religion says you cannot eat shellfish or oysters. It is a decision to ignore our human nature, or to use our flawed brain software against us. If you do look at the world rationally, you get called a robot or kooky or mean. Also, it is hard work. And why treat your tools well when they may be stolen, or when you can get cheap replacements? Why not just give up?

Morten said...

Once the crappy shit drives up the price of good shit, there will be a larger market for good shit.

Anonymous said...

I think this bug is what inspires religious fundamentalism. Fundamentalists recognize the accelerating complexification that is not improving their lives yet keeps consuming resources. If you fix this bug, you will strongly align human effort and eliminate much of the friction leading to today's conflicts.

Anonymous said...

The goal is not to waste resources. The goal is to deskill coders: people who know how to glue widgets together are cheaper than people who can write clear tight code.

Clear tight code also requires knowing what you want beforehand. People don't seem to.

Anonymous said...

Nice post. It can be seen as analogous to modern global economics. Not mentioned is the fact that this lazy over-consumption of resources comes at a huge price to the earth as a whole (not only do we waste vast amounts of resources, we are polluting our environment; we will die suffocating in our own waste). Before you think I've gone off topic, please realize that the energy wasted on our vast computing infrastructure is consuming and polluting the world. At a cultural level the disconnect is plain to see in our children. They consume technology and have a superficially sophisticated relationship with it, but within their peer groups they denigrate anyone trying to understand what's actually going on, and they have very vague ideas about the world in general (their activism, such as it is, focuses on narrow areas of sexual politics or simplistic, populist, transient issues).

Anonymous said...

I don't know, you seem to compare Wordpress to MapReduce on Hadoop over a 10,000 node cluster.

The reason why someone uses pre-built things is that the knowledge required to build something like that yourself is insane. You can spend 10 years of academic research in topics like databases and still not even get close to the performance of the fastest databases out there, just because they had 20 years and hundreds of developers and testers to improve and validate them.

The 64k / 8k / 4k scene example is also pathetic, as the knowledge you acquire in these areas is so extremely specialized that it is completely useless in real-world applications. Or do you think a game like Unreal Tournament will use the same hacks and tricks as the 4k or 1k scene? Hell no: way too computationally expensive, impossible to debug, extremely limited in functionality. (Why doesn't the 1k scene just build their own processor? Resource leakers, pff...)

Yes, there is a degree of resource leak, but the issue is not the products out there. It's the speed at which stuff has to evolve (why on earth would I code my own Wordpress when I have to launch a website in 14 days? That leaks a resource a million times more important - my time), and, much more importantly, the skill of developers. There is a reason only a handful of people can even work on projects like Hadoop, HDFS, Kafka, Impala, Redis, Aerospike, etc. They are insanely complex and need a lot of very deep understanding of the programming language, the underlying operating system, networking and much more. Most of those projects also have specialists in each area; rarely does a single person build the entire software and make it extremely scalable and resilient, especially when networking is involved.

For example, Ruby on Rails is extremely slow compared to highly optimized Java or C++. But it's easy. And in times where skilled developers are also an extremely limited resource, the resource leak of CPUs is acceptable compared to having nobody to code at all.

g said...

Trading off one resource versus another is a perennial problem in software engineering. The archetypal example is a time versus space tradeoff. No one answer is always best: it depends on whether you have more time or more space available compared to the other.

One tradeoff present here is CPU power versus programmer-hours. It's possible to spend large numbers of programmer-hours to write highly efficient and tightly-integrated monolithic code. But hardware is cheap and programmers are not. In most cases it is more productive overall to spend fewer programmer-hours to write slower, more loosely-coupled code and run it on faster hardware. The net result is more solved problems overall.

James said...

The comments on 6 and 11 December hint at the actual cause: humans are a tool-using species. We are adapted to using black-box tools, developed by those who specialise in them, to achieve our goals.